
Jaypee Gold Standard Mini Atlas Series®

History of Medicine

System requirements:
• Windows XP or above
• Power DVD player (software)
• Windows Media Player version 10.0 or above (software)
The accompanying Photo CD-ROM is playable only on a computer
and not in a DVD player.
Kindly wait a few seconds for the Photo CD to autorun. If it does
not autorun, please do the following:
• Click on My Computer
• Click the drive labelled JAYPEE and, after opening the
drive, double-click the file Jaypee
Jaypee Gold Standard Mini Atlas Series®

History of Medicine

RK Marya MD PhD
Professor and Head
Unit of Physiology, School of Medicine
Faculty of Medicine and Health Sciences
Asian Institute of Medicine,
Science and Technology (AIMST)
Bedong, Kedah Darul Aman
Malaysia

JAYPEE BROTHERS MEDICAL PUBLISHERS (P) LTD


New Delhi • Ahmedabad • Bengaluru • Chennai • Hyderabad
Kochi • Kolkata • Lucknow • Mumbai • Nagpur • St Louis (USA)
Published by
Jitendar P Vij
Jaypee Brothers Medical Publishers (P) Ltd
Corporate Office
4838/24 Ansari Road, Daryaganj, New Delhi - 110002, India, Phone: +91-11-43574357
Registered Office
B-3 EMCA House, 23/23B Ansari Road, Daryaganj, New Delhi - 110 002, India
Phones: +91-11-23272143, +91-11-23272703, +91-11-23282021, +91-11-23245672
Rel: +91-11-32558559, Fax: +91-11-23276490, +91-11-23245683
e-mail: jaypee@jaypeebrothers.com, Visit our website: www.jaypeebrothers.com
Branches
 2/B, Akruti Society, Jodhpur Gam Road Satellite, Ahmedabad 380 015
Phones: +91-79-26926233, Rel: +91-79-32988717, Fax: +91-79-26927094
e-mail: ahmedabad@jaypeebrothers.com
 202 Batavia Chambers, 8 Kumara Krupa Road, Kumara Park East, Bengaluru 560 001
Phones: +91-80-22285971, +91-80-22382956, 91-80-22372664, Rel: +91-80-32714073
Fax: +91-80-22281761 e-mail: bangalore@jaypeebrothers.com
 282 IIIrd Floor, Khaleel Shirazi Estate, Fountain Plaza, Pantheon Road, Chennai 600 008
Phones: +91-44-28193265, +91-44-28194897, Rel: +91-44-32972089, Fax: +91-44-28193231
e-mail: chennai@jaypeebrothers.com
 4-2-1067/1-3, 1st Floor, Balaji Building, Ramkote Cross Road, Hyderabad 500 095
Phones: +91-40-66610020, +91-40-24758498, Rel: +91-40-32940929, Fax:+91-40-24758499
e-mail: hyderabad@jaypeebrothers.com
 No. 41/3098, B & B1, Kuruvi Building, St. Vincent Road, Kochi 682 018, Kerala,
Phones: +91-484-4036109, +91-484-2395739, +91-484-2395740, e-mail: kochi@jaypeebrothers.com
 1-A Indian Mirror Street, Wellington Square, Kolkata 700 013
Phones: +91-33-22651926, +91-33-22276404, +91-33-22276415, Rel: +91-33-32901926
Fax: +91-33-22656075, e-mail: kolkata@jaypeebrothers.com
 Lekhraj Market III, B-2, Sector-4, Faizabad Road, Indira Nagar, Lucknow 226 016
Phones: +91-522-3040553, +91-522-3040554, e-mail: lucknow@jaypeebrothers.com
 106 Amit Industrial Estate, 61 Dr SS Rao Road, Near MGM Hospital, Parel, Mumbai 400 012
Phones: +91-22-24124863, +91-22-24104532, Rel: +91-22-32926896, Fax: +91-22-24160828
e-mail: mumbai@jaypeebrothers.com
 “KAMALPUSHPA” 38, Reshimbag, Opp. Mohota Science College, Umred Road, Nagpur 440 009 (MS)
Phone: Rel: +91-712-3245220, Fax: +91-712-2704275, e-mail: nagpur@jaypeebrothers.com
USA Office
1745, Pheasant Run Drive, Maryland Heights (Missouri), MO 63043, USA,
Ph: 001-636-6279734, e-mail: jaypee@jaypeebrothers.com, anjulav@jaypeebrothers.com
History of Medicine
© 2009, RK Marya
All rights reserved. No part of this publication and Photo CD ROM should be reproduced, stored in a retrieval
system, or transmitted in any form or by any means: electronic, mechanical, photocopying, recording, or
otherwise, without the prior written permission of the author and the publisher.

This book has been published in good faith that the material provided by author is original. Every effort
is made to ensure accuracy of material, but the publisher, printer and author will not be held responsible
for any inadvertent error(s). In case of any dispute, all legal matters are to be settled under Delhi jurisdiction
only.

First Edition: 2009


ISBN 978-81-8448-541-7
Typeset at JPBMP typesetting unit
Printed at Ajanta Offset & Packagings Ltd., New Delhi
Preface

What we know about medicine today is the result of discoveries made
by men and women over thousands of years. The history of medicine is not
just a continuum of scientific achievements. It is deeply influenced by
the personalities of the men and women who made these breakthroughs.
This field’s greatest practitioners were, like the rest of us, human beings
with flaws and weaknesses. Risk-takers and rebels, they frequently
challenged conventional wisdom and stirred storms of controversy. Some
were ridiculed, even reviled, in their own time. Yet these same people
made some of history’s greatest medical discoveries and changed the
path of medicine.
“It has always been the fate of those who have improved the arts
and sciences by their discoveries, to be beset by envy, malice, hatred,
detraction, and calumny.”
Leopold Auenbrugger (1722-1809)
This book traces the development of medicine from ancient
treatments by charms and spells, pilgrimages, or even the touch of a
king or queen, to modern transplant and robotic surgery. Short
biographical sketches highlight the role of eminent physicians, each of
whom took a small step toward the modernization of medicine.
Many of the discoveries discussed in this book may seem to be the
result of serendipity. Actually, behind these “chance discoveries” lie
decades of hard work. Millions must have seen an apple fall from a
tree, but it was left to Newton to interpret its significance.
A seed grows into a sapling only if it falls on fertile and prepared
ground. As one Nobel laureate, Charles Nicolle, commented: “Fortune
favors only those who know how to court her.”
While writing this book, the aim has been to make the text highly
informative yet short and interesting. The stories and legends woven
around the historical figures provide flesh and blood to the bone-dry
history of medicine. The book is expected to find favor with medical
students as well as their teachers, and even with lay readers.
The history of medicine is becoming part of the undergraduate medical
curriculum in most medical schools in the West. Why should
modern physicians read the history of medicine? This question is best
answered by the following quotation from an eminent 19th century surgeon:
“Only a man who is familiar with the art and science of the past is
competent to aid in its progress in the future.”
Theodor Billroth (1829-1894)
I am grateful to Shri Jitendar P Vij (Chairman and Managing Director)
and Mr Tarun Duneja (Director–Publishing) of Jaypee Brothers Medical
Publishers (P) Ltd, New Delhi, for their wholehearted cooperation in
the publication of this book.
R K Marya
(E-Mail: rkumarmarya@yahoo.com)
Contents

PART ONE: Ancient Medicine


1. Ancient Indian Medicine ......................................................... 2
2. Ancient Chinese Medicine ...................................................... 7
3. Mythological Gods and Goddesses in Ancient Western
Medicine ................................................................................. 11
4. Hippocrates ............................................................................ 14
5. Aristotle ................................................................................. 19
6. Claudius Galen ....................................................................... 21

PART TWO: Medieval Medicine


7. The Dark Ages in Western Medicine ................................... 26
8. The Black Death .................................................................... 31
9. The Golden Era of Arabic Medicine .................................... 32

PART THREE: 14th to 18th Century Medicine


10. Renaissance of Medicine ....................................................... 38
11. Bloodletting and Purgatives—Panacea for
All Inflammations .................................................................. 39
12. 18th Century Europe—An Export House of Diseases ...... 41
13. Uroscopy—The Ultimate Diagnostic
Investigation ........................................................................... 43
14. Andreas Vesalius .................................................................... 44
15. Leonardo Da Vinci ................................................................. 47

16. Paracelsus ............................................................................... 49


17. Gabriele Falloppio ................................................................. 52
18. Ambroise Pare ........................................................................ 54
19. William Harvey ...................................................................... 55
20. Thomas Sydenham ................................................................ 59
21. Marcello Malpighi ................................................................. 62
22. Anton Leeuwenhoek
“The First Microbiologist” ................................................... 63
23. Stephen Hales ........................................................................ 66
24. Ingenious Investigations on Digestive Physiology ............ 68
25. Giovanni Morgagni ................................................................ 69
26. Leopold Auenbrugger ............................................................ 72
27. John Hunter ............................................................................ 73
28. Role of Criminals in the Development of Anatomy ........... 75
29. Matthew Baillie ..................................................................... 76
30. Joseph Priestley .................................................................... 77
31. William Withering .................................................................. 79
32. Edward Jenner ........................................................................ 81

PART FOUR: 19th Century Medicine


33. Development of Thermometer ............................................. 86
34. Rene Laennec ......................................................................... 88
35. Advances in Physiology of Digestion ................................. 90
36. François Magendie ................................................................. 93
37. Charles Bell ............................................................................ 95
38. Robert Graves ........................................................................ 97
39. Rudolf Virchow ...................................................................... 98
40. Julius Cohnheim .................................................................. 100
41. Alois Alzheimer ................................................................... 102

42. Franz Nissl ........................................................................... 104


43. Guillaume Duchenne ........................................................... 105
44. Carl Ludwig .......................................................................... 107
45. Jean-Martin Charcot ........................................................... 109
46. Joseph Babinski ................................................................... 113
47. Sigmund Freud ..................................................................... 115
48. Claude Bernard ..................................................................... 118
49. Brown-Séquard .................................................................... 122
50. Charles Darwin .................................................................... 124
51. Health Care in 19th Century London ................................ 126
52. Historical Image of Women as Patients ............................. 130
53. Miasma Theory of Disease ................................................ 132
54. Ignaz Semmelweis ................................................................ 134
55. Louis Pasteur ....................................................................... 136
56. Sir Joseph Lister .................................................................. 140
57. Robert Koch ......................................................................... 143
58. Paul Ehrlich .......................................................................... 147
59. Florence Nightingale ............................................................ 149
60. The Red Cross Movement .................................................. 153
61. The First Woman Doctors in the USA and England ......... 155
62. Discovery of Anesthesia ..................................................... 160
63. Theodor Billroth .................................................................. 162
64. William Stewart Halsted ..................................................... 165
65. Medical Missionaries .......................................................... 167
66. History of Western Medical Education in India ............... 169
67. Dr Jivraj Mehta ................................................................... 172
68. Tropical Medicine—A Byproduct of Imperialism ........... 174
69. The Antipyretic Era ............................................................ 175

PART FIVE: 20th Century Medicine


70. The Nobel Prize ................................................................... 180
71. Nobel Prize Winners in Physiology or Medicine ............. 182
72. American Research Workers in Medicine:
From Zero to Heroes .......................................................... 191
73. The Brown Dog Affair ........................................................ 193
74. Wilhelm Conrad Roentgen .................................................. 195
75. The Pioneer Neurohistologists ........................................... 198
76. Marie Curie .......................................................................... 200
77. Invention of Sphygmomanometer ...................................... 203
78. Harvey Williams Cushing ................................................... 205
79. Ivan Petrovich Pavlov ......................................................... 208
80. History of Blood Transfusion ............................................ 210
81. 20th Century Businessmen-Surgeons ................................ 212
82. The Golden Era of Surgery ................................................. 213
83. Ronald Ross ......................................................................... 214
84. From Mal’aria to Malaria ................................................... 215
85. Influenza Pandemic 1918 .................................................... 218
86. August Krogh ....................................................................... 220
87. JS Haldane ............................................................................ 222
88. Gerhard Domagk: Discovery of Sulphonamides .............. 225
89. Alexander Fleming: The Beginning of Antibiotic Era ....... 226
90. Selman Waksman .................................................................. 229
91. History of Tuberculosis ...................................................... 231
92. Rustom Jal Vakil .................................................................. 233
93. Role of Battle-Fields in Medical Research ........................ 236
94. Discovery of Vitamins ........................................................ 238
95. Discovery of Endocrines ..................................................... 242

96. Frederick Banting: Discovery of Insulin ........................... 246


97. Walter Cannon ...................................................................... 248
98. Walter Rudolf Hess ............................................................. 250
99. Egas Moniz .......................................................................... 251
100. Walter Freeman .................................................................... 253
101. Treatment of Psychological Disorders .............................. 256
102. Otto Loewi ........................................................................... 262
103. Henry Dale ........................................................................... 263
104. History of Electrodiagnostic Techniques .......................... 265
105. Werner Forssmann: The First Cardiac Catheterization ... 270
106. Pioneers in Heart Surgery ................................................... 272
107. Christiaan Barnard ............................................................... 274
108. Discovery of Anticoagulants .............................................. 276
109. Gertrude Elion ...................................................................... 279
110. The Thalidomide Disaster .................................................. 280
111. History of Tobacco Smoking and Lung Cancer ................ 282
112. History of Cancer ................................................................ 285
113. Hans Selye ............................................................................ 287
114. Some Famous Neurophysiologists .................................... 289
115. Willem J Kolff ...................................................................... 292
116. History of Organ Transplantation ..................................... 296
117. Joseph E Murray ................................................................. 298
118. Wilder Penfield .................................................................... 300
119. Eric Richard Kandel ............................................................. 301
120. Nitric Oxide: From Menace to Marvel of the Decade ..... 302
121. Test Tube Babies (In Vitro Fertilization) .......................... 304
122. Barefoot Doctors of China ................................................. 306
123. V Ramalingaswami ............................................................... 307

124. AS Paintal ............................................................................. 308


125. Instrumental Error Leads to an Important Discovery ..... 310
126. Alexis Carrel ......................................................................... 311
127. The Eugenic Movement ...................................................... 312
128. History of Alcohol .............................................................. 315
129. Truth Serum ......................................................................... 317
130. Richard Axel ......................................................................... 319
131. When Patients Refused to Terminate A Failed
Drug Trial ............................................................................. 321
132. Discovery of Helicobacter Pylori ...................................... 322
133. The Population Explosion: An Impact of Better
Health Care ........................................................................... 324
134. Dr William H Masters Learns Sexology from
Prostitutes ............................................................................ 326
135. Alternative Medicine ........................................................... 327
136. The Ig Nobel Prize .............................................................. 335

PART SIX: History of Development of Specialities


137. History of Immunology ...................................................... 340
138. Emil Behring ......................................................................... 341
139. Eli Metchnikoff ................................................................... 342
140. Charles Richet and other Nobel Laureates in
Immunology ......................................................................... 344
141. Discovery of T- and B-Lymphocytes ............................... 347
142. History of Ophthalmology — Ancient Concepts of
Anatomy of the Eye ............................................................ 348
143. Thomas Young ..................................................................... 349
144. Hermann Helmholtz ............................................................ 351
145. Frans Donders ...................................................................... 353

146. von Graefe ............................................................................ 354


147. Herman Snellen .................................................................... 356
148. Cataract Surgery .................................................................. 357
149. Harold Ridley ....................................................................... 359
150. History of Otorhinolaryngology—History of
Laryngoscope ....................................................................... 360
151. Morell Mackenzie ............................................................... 361
152. Robert Bárány ...................................................................... 363
153. Georg von Békésy ............................................................... 364
154. History of Orthopedics—Nicolas Andry:
“The Birth of Orthopedics” ............................................... 366
155. Percival Pott ......................................................................... 367
156. Antonius Mathyjsen ........................................................... 369
157. Hugh Owen Thomas ............................................................ 370
158. Robert Jones and Watson-Jones ......................................... 371
159. Kuntscher and Ilizarov ........................................................ 373
160. John Charnley—Joint Replacement Therapy ..................... 374
161. PK Sethi ................................................................................ 377
162. History of Obstetrics and Gynecology—Childbirth ....... 378
163. Cesarean Section .................................................................. 383
164. Abortion ............................................................................... 385
165. Contraceptives ..................................................................... 387
166. Gynecology .......................................................................... 392
167. Baby Incubators .................................................................. 395
168. Wet Nursing and Artificial Feeding .................................... 398

PART SEVEN: Medical Marvels of 21st Century


169. Insulin Pump Therapy ........................................................ 402

170. Telemedicine ......................................................................... 403


171. Robotic Surgery ................................................................... 404
172. Artificial Heart ..................................................................... 404
173. Fetal Surgery ........................................................................ 405
174. Gene Therapy ...................................................................... 406
175. Scorpion Venom—A Diagnostic Tool ................................ 406

References ............................................................................. 409

Index ...................................................................................... 427


ANCIENT INDIAN MEDICINE


Some experts consider Ayurveda the world’s most ancient system of
medicine. Ayurvedic medicine finds mention in ancient Chinese
medical texts, as well as in the writings of Hippocrates, the founder of
modern Western medicine. Ayurvedic herbs native to India have been
excavated from Egyptian tombs dating back many thousands of years.
Ayurveda, the Indian (Hindu) art of healing, is documented in the
four books of Hindu spirituality known as the Vedas, written between
3000 BC and 1000 BC. The earlier scripts were written on perishable
materials such as Taalpatra and Bhojpatra; later scripts were written
on stone or copper sheets. This system views health as harmony between
body, mind, and spirit. It reiterates that health and disease are not
predetermined and that life can be prolonged by human effort. Ayurveda
speaks of eight branches of medicine: internal medicine; surgery, including
anatomy; ear, nose and throat diseases; pediatrics; toxicology; psychiatry;
the science of rejuvenation; and the science of fertility. Around 1500 BC,
Ayurveda came to be divided into two schools. Rishi Atreya’s school
represented the physicians, whereas Rishi Dhanvantari’s school represented
the surgeons. In this period the physicians were drawn mainly from the
Brahmins, highest in the hierarchy of the Hindu caste system, who refused
to touch blood, wounds, injuries or the dead. The art of surgery was left
to the Kshatriyas and Vaishyas, i.e. those lower in the caste system.
Charak (around 600 BC) and Sushruta (around 400 BC) culled
material about health care from the Vedas and compiled it into two
treatises, the Charak Samhita (samhita means compilation) and the
Sushrut Samhita, respectively. These books were widely translated and
influenced the Greek and Roman medicine of that era and Arabic medicine
in the Middle Ages. Scholars from China, Tibet, Egypt, Persia, Greece,
and Rome flocked to India to learn the Indian system of medicine.
Charak (Fig. 1.1) belonged to the Atreya school of Ayurveda. He
wrote extensively about the hospital system. The instructions included
location, equipment, food supply, medical and other personnel, and even
the necessity of entertainment for the patients. The Charak Samhita deals
with subjects such as physiology, anatomy, etiology, pathogenesis,
symptoms and signs of various diseases, as well as preventive measures.
The skeptical modern reader who wonders whether this ancient wisdom
can be believed need only read Charak’s month-by-month description
of the development of the fetus in the womb. The book also mentions the
qualities and actions of about 10,000 herbal plants. Charak described
the diagnosis and treatment of diseases such as diabetes, tuberculosis
and heart disease.
Sushruta (Fig. 1.2), from the Dhanvantari school of Ayurveda,
brought ancient Indian surgery to its peak. His book, the Sushrut
Samhita, provides minute details regarding preoperative and
postoperative care, diet and surgical techniques, as well as indications,
contraindications and complications of various surgical procedures.

Fig. 1.1: Charak Fig. 1.2: Sushruta

The book provides a list of about 125 surgical instruments, including their
specifications. Various operations described include laparotomy,
intestinal repair, surgery for intestinal obstruction, Cesarean section
and amputation. Surgical procedures for the removal of vesical calculi
and for the treatment of anal fissures, fractures, hernias, and cataract are
also given. Many details of the operations given in the book are such
as could come only from a practicing surgeon, making it certain that these
elaborate surgical techniques were a reality in Sushruta’s time. During
the training of a surgeon, the art of making incisions on the abdomen was
practiced on the Pusphlala gourd, cucumber or watermelon. Excision was
taught on a full water bag. Venesection was taught on dead animals.
In the Sushrut Samhita, an ingenious method for suturing the severed
ends of the intestine is mentioned. The cut ends of the intestine were
apposed to each other, and big black ants, collected specifically for this
purpose, were made to bite the apposed ends; their heads were then
severed once their pincers had closed. The pincers remained
in situ due to rigor mortis, retaining the cut ends of the intestine in
apposition for a few days. The ants thus served the purpose of the
catgut sutures used nowadays. This account is found in the 8th century
Arabic translation of the Sushrut Samhita called “Kitab-Sushrud.”
The first translation of the Sushrut Samhita was ordered by the Caliph
Mansur (753-774 AD), who brought Hindu scholars and their books to
Baghdad. The Caliph Harun (786-808 AD) appointed Hindu physicians
to Baghdad hospitals and ordered further translations of Indian books
on medicine, pharmacology, toxicology, astronomy, and other subjects
into Arabic. Alberuni, a member of the court of Mahmud of
Ghazni (997-1030 AD), mentions the Arabic translation of the Charak
Samhita in his book, although he complains of its inaccuracies. The first
English translation of the Sushrut Samhita was made by Kaviraj Kunja Lal
Bhishagratna, in three volumes, in 1907 at Calcutta.
Soon after Sushruta’s time, the surgical branch of Ayurveda began
to fade away. The causes of the decline are not clear, but one cause
mentioned is that after the Kalinga War, Emperor Ashoka (304-232 BC),
influenced by Buddhist teachings, banned any bloodshed in the kingdom
in about 250 BC. Thereafter, Ayurvedic practitioners were allowed to
work only as physicians (Vaidyas). Many 17th century foreign visitors
commented on the virtual nonexistence of surgery in India. The
important role of Ayurvedic Vaidyas in the health care of the Indian
population is also recorded in accounts from the reign of Chandragupta II
(375-414 AD).
Plastic and reconstructive surgery had a special place in ancient
Indian surgery. The procedure for reconstructive rhinoplasty (Fig.
1.3) is described in detail in the Sushrut Samhita. Such an operation was
needed because cutting off the nose (or the pinna of the ear) was an official
punishment for serious crimes, especially adultery. In 1793, such
operations were witnessed by Thomas Cruso and James Findlay, senior
British surgeons in the Bombay Presidency, and recorded in detail.
According to them, they saw the procedure performed on five Maratha
employees of the East India Company army, who had been captured by
the forces of Tipu Sultan and had their noses and one pinna cut off. The
surgeons gave a description of the procedure as well as illustrative
diagrams. The technique given in this report was similar, but not identical,
to that given in the Sushrut Samhita: in this operation the skin flap was
taken from the forehead, whereas Sushruta recommends a skin flap from
the cheek. Publication of this report led many British and European
surgeons to undertake such surgeries.

Fig. 1.3: Rhinoplasty
In its early stages, the system of Ayurvedic medicine was
transmitted orally via the Gurukul system, until written scripts came
into existence. In the Ayurvedic system, at the beginning of the course
the Guru gave a solemn address in which he directed the students
of Ayurveda to lead a life of chastity, honesty, and vegetarianism. The
student was to strive with all his being to heal the sick. He (females were
not given the training) was not to betray patients for his own advantage.
He was required to dress modestly and avoid drugs and alcohol. He was
to be collected and self-controlled, measured in speech at all times. He
was to try constantly to improve his knowledge and technical skill. In
the home of the patient, he was to be courteous and modest, directing all
his attention to the patient’s welfare. He was not to divulge any
knowledge about the patient and his family. If the patient was incurable,
he was to keep this information to himself if it was likely to harm the
patient or his family. The training period appears to have lasted seven
years. At the end of the course, the student had to pass an examination.
The growth and development of Ayurveda never took off from
where it was left by Charak and Sushruta. The problem worsened with
the repeated invasions from outside the country and the establishment
of Muslim rule, under which the teaching and training of Ayurveda was
banned. The Muslim rulers imposed the Arabian system of medicine,
called Unani medicine. Later, the British rulers also did not encourage
Ayurveda; they believed only in the Western system of medicine.
Initially, when an acute shortage of British medical doctors in the
British armed forces was felt, Indians trained in Ayurvedic medicine
were given additional training in Western medicine to produce assistant
surgeons in support of the full (British) surgeons attached to the British
army. By 1833, it was ordered that medical training in British India
would be in English only.
High-caste Brahmin Vaidyas were not willing to touch the cadavers.
Therefore, Ayurveda gradually became sidelined.
An important cause of the lack of progress in the Indian system of
medicine seems to lie in the very origin of the medical system. Since
these medical treatises were culled from the holy Vedas, no Hindu
could ever challenge their beliefs and practices. Moreover, challenge to the
authority of seniors, especially on religious matters, has not been a
strong point of Hindu psychology.
The official patronage of Ayurveda was revived only after Indian
independence in 1947. The Ayurvedic schools were officially recognized.
Today, over 100 Ayurvedic colleges in India grant degrees after a five-
year teaching program. Over 300,000 Ayurvedic physicians belong to
the All India Ayurvedic Congress, making it the largest medical
organization in the world. Ayurvedic practitioners have been appointed
as Honorary Physician to the President of India. Every year, on the
occasion of Dhanvantari Jayanti, a prestigious Dhanvantari award is
conferred on a famous personality in the medical sciences, including
Ayurveda. Today, Kerala is one of the many states in India that promote
research and practices of Ayurveda. This has been attributed to its well-
established Ayurveda hospitals (Vaidya-shalas) and Ayurveda pharma-
ceutical companies.

ANCIENT CHINESE MEDICINE


The oldest classic in traditional Chinese medicine is the "Neijing
Suwen" (Basic Questions of Internal Medicine). It is supposed to have
been the result of a dialogue between the King Huang Di (Yellow Emperor,
2696–2598 BC) and his minister. This treatise, discovered in the year
265 AD, explains the fundamental roots of traditional Chinese medicine.
Bian Que (500 BC) (Fig. 1.4) is another famous name in ancient
Chinese medicine. He was reputed to be an excellent diagnostician and an
expert in pulse taking and acupuncture therapy.
One day, Bian Que heard that the Queen
of Hu state had died. He felt very sad
and went to the palace to pay his
respect. He found that the inner thighs
of the Queen were not yet very cold and
he diagnosed her as being in a state of "fake death". (Why he touched the
inner thighs of the supposedly dead queen is not explained.)
Under his care, the Queen recovered fully. Thus Bian Que got the title
"The doctor who brings back from death". From that day, Bian Que was
regarded as a doctor of the level of a God.

Fig. 1.4: Bian Que
Zhang Zhongjing is the most famous of China’s ancient herbal
doctors. When Zhang Zhongjing was 50 years old, an epidemic of plague
swept over China in which about two-thirds of the population was
infected. A saddened Zhongjing decided to devote the rest of his life to
finding a cure for human diseases. After several decades, he finished
his work "Shanghan Zabing Lun," which became a cornerstone of
Chinese medicine. The book described the formulas of 1000 herbal
medicines, some of which are in use even today.
The five element theory forms the basis of the entire traditional
Chinese medicine. According to this theory, everything in the universe,
including our health, is governed by five natural elements: wood, fire,
earth, metal and water. The wood element is associated with the liver
and gallbladder; the fire with the heart and small intestine; the earth with
stomach and spleen; metal with lungs and large intestine; and water with
the kidney and bladder.
In addition to the five elements, there are eight guiding principles:
Yin/Yang, cold/heat, deficiency/excess and interior/exterior. Heat/cold
refer to the overall energy of the patient. A patient with a slow metabolism,
chills, pale skin, and low-grade fever would be said to have a cold condition.
A patient with a heightened metabolism, high fevers and flushed complexion
is said to have a hot condition. Interior/exterior refers to the location of
the patient's problem. An exterior condition refers to disorders of the hair,
skin, muscles, joints, peripheral nerves and blood vessels. An interior
condition refers to disorders of the organs deep in the body. All acute
disorders are considered excess conditions, whereas chronic ailments
are considered deficiency conditions. Yin/Yang are generalizations
of the above principles. A condition is categorized as dominantly Yin or
Yang. Yin energy is associated with cold, female energy and represents
the solid organs. Yang is associated with hot, male energy and represents
the hollow organs.
In the Chinese system of medicine, the examination of a patient
starts with a detailed interview to seek information about the complaints,
quality of sleep, appetite, preferred foods and stress (Fig. 1.5). The
physician takes a special interest in the patient's sense of smell: in the
five element theory, each element has a corresponding smell associated with it.

Fig. 1.5: Chinese doctor



Next the pulse is examined in detail. Radial pulse is examined at six sites
in each wrist, three superficial and three deep. All the internal organs of
the body are represented in these 12 pulses. Taking the pulse is the
most important diagnostic tool in ancient Chinese medicine. It requires
years of training to learn this art. The tongue of the patient is also given
special attention, since it is believed to be a strong barometer of the
patient’s health. Different parts of the tongue are believed to be affected
by disorders of different organs of the body.
A unique feature of the ancient Chinese medicine is the meridian
system (Fig. 1.6). Chinese doctors view the body as regulated by a
network of energy pathways called meridians that link and balance the
various organs. These meridians connect the internal organs with the
exterior of the body and connect the person to the environment.
It would be a fruitless effort to interpret the ancient system of
Chinese medicine in terms of allopathic medicine. In a major part of rural
China, this is the only form of medicine practiced even today, where it is
known as Traditional Chinese Medicine.

Fig. 1.6: Chinese meridians



MYTHOLOGICAL GODS AND GODDESSES
IN ANCIENT WESTERN MEDICINE
Like all other ancient civilizations, the ancient Greeks and Romans considered
illness to be caused by the anger or curse of the Gods for their misdeeds and
sought cure by worshipping Gods and Goddesses in the temples. Apollo
(Fig. 1.7) was the most widely revered of the Greek and Roman Gods.
Apollo is usually depicted as the perfection of youth and beauty. He was
believed to be the God of prophecy, medicine, music, poetry, and archery.
Apollo had an intrinsically dual nature; on one hand he could bring good
fortune and avert evil; on the other hand, he could inflict disaster. His
son, Asclepius was more widely worshipped as the God of medicine.
Asclepius (Fig. 1.8) was a Greek God of medicine and healing.
According to the Greek mythology, Asclepius’s mother was unfaithful

Fig. 1.7: Apollo
Fig. 1.8: Asclepius

to Apollo, who, in anger, cast her to the fire. As her body began to burn,
Apollo felt sorrow for his unborn son, and snatched the child Asclepius
from his mother’s corpse. Apollo handed over the infant to Centaur
Chiron, who became his tutor and mentor. Chiron taught Asclepius the
art of healing with the help of drugs and surgery. With these gifts,
Asclepius exceeded the fringes of human knowledge and was able to give
life even to the dead. This angered Zeus (the Supreme God) who slew
Asclepius with a thunderbolt. Later, realizing the good Asclepius had
brought to humans, Zeus made him into a God. Thus, Asclepius began
to be worshipped as the God of medicine throughout Greece and later in
Roman Empire. A large number of temples were built in those lands,
where priests offered cures for various ailments. Asclepius's
two daughters, Hygeia and Panacea, were also worshipped along with
Asclepius, as the goddess of welfare and prevention of disease and the
goddess of healing, respectively.
The temples of Asclepius were always built in healthy surroundings
with fine scenery and natural springs. The sick and invalids would drink
and bathe in the spring water and then enter the shrine and spend the
night in a dormitory, with nonpoisonous snakes all around them. During
the night, they were supposedly visited by the God Asclepius in their
dreams and offered clues regarding the cure.
Asclepius has always been depicted standing with a long staff,
around which is entwined a large snake. The staff symbolizes the tree of
life. The coiling snake represents healing power because it was
erroneously believed to be immune to illness and disease.
From early 16th century onwards, the staff of Asclepius (Fig. 1.9)
(a single snake entwined around a staff) has been used as a symbol of
medical profession. According to some scholars, in this symbol, a worm
is coiled around a stick. In the ancient times, the infection by parasitic
filarial worms was very common. The worm crawled around the victim’s
body, just under the skin. The physicians treated this infection by
cutting a slit in the patient’s skin, just in front of the worm’s path. As
the worm crawled out of the cut, the physician carefully wound the
worm around a stick, until the entire parasite had been removed. It is
believed that because this type of infection was so common, physicians
advertised their services by displaying the worm on a stick. However,
the logos of most of the medical associations including that of the World
Health Organization use a single serpent (not a worm) entwined around
a staff in memory of the staff of Asclepius.
Some medical organizations, especially in the USA, use Caduceus
of Hermes (Fig. 1.10)—a short rod entwined by two snakes and topped
by a pair of wings as their logo. Caduceus of Hermes has nothing to do
with medical profession. The caduceus or the magic wand of the Greek
God Hermes represents the God of invention, conductor of the dead and

Fig. 1.9: Staff of Asclepius: Symbol of modern medicine
Fig. 1.10: Caduceus of Hermes

protector of the merchants and thieves. A major reason for the current
popularity of the caduceus as a medical symbol was its ill-informed
official adoption as the insignia for the Medical Department of the
United States Army in 1902. According to a wag, in view of the greedy
attitude of most of the doctors, caduceus is the most appropriate logo
for the medical profession today.

HIPPOCRATES
Hippocrates (460-380 BC) (Fig. 1.11)
was a Greek physician and professional
trainer of medical students. He is called
the father of modern (Western)
medicine, because he rejected the views
of his time that considered illness to be
caused by disfavor of the Gods or by
possession of evil spirits. He rejected
the superstitions and held the belief that
illness had a physical and rational
explanation. He based his medical
practice on the observations of human
body in health and disease. Hippocrates
believed that the natural healing process is accelerated by rest, a good
diet, fresh air, and cleanliness.

Fig. 1.11: Hippocrates

He also observed that there were individual differences
in the severity of symptoms, and that some individuals were able to
cope with an illness better than others. Probably, the most important
contribution in the development of medical science was his
recommendation that the physicians should record their findings and
their medical methods so that the records may be passed on to the
coming generations. He and his followers wrote about 60 treatises, now
collectively known as the Hippocratic Collection (Hippocratic corpus).
His faith in scientific methodology is illustrated by the statement: "There
are in fact two things: science and opinion; the former begets knowledge,
the latter ignorance."

HIPPOCRATIC OATH
Hippocrates is best remembered for the oath named after him, the spirit
of which is as applicable today as it was 2500 years ago. To date, no
other person has put forward medical ethics better than those advocated by
him. A slightly abridged version of the Hippocratic Oath is given below.
• I swear by Apollo the healer, by Asclepius, by Hygeia, by Panacea,
and all the Gods and Goddesses to be my witness, that I will fulfil
this oath to the best of my ability and judgment.
• I will look upon him who shall have taught me this art even as one
of my own parents. I will share my substance with him and I will
supply him his necessities if he be in need.
• The regimen I adopt shall be for the benefit of the patient, according
to the best of my ability and judgment and not for his hurt or for any
wrong.
• I will give no deadly drug to any, though it may be asked of me.
• I will not aid a woman to procure abortion.
• I will not use a knife, even on sufferers from stones, but will withdraw
in favor of such men as are engaged in such work.
• Whatsoever house I enter, there will I go for the benefit of the sick,
refraining from any act of seduction of a male or a female.
• In attendance on the sick, or even apart from that, whatsoever
things I see or hear, I will keep silence, counting such things to be
a sacred secret.
• Pure and holy I shall keep my life and my art.

The Hippocratic Oath has been regarded as a message of timeless
validity. However, historians have observed violations of almost every
injunction of the Hippocratic Oath by medical practitioners throughout
the history of medicine.

HIPPOCRATIC COLLECTION
The greatness of Hippocrates lies not as much in the oath named after
him, as in the number of books on medical practice written by him and
his followers, collectively called the Hippocratic collection. These books
contain descriptions of actual cases recorded in such a clear, short, and
succinct manner that they could not be surpassed in the next two
millennia. For example, complications of prolonged continuous fever
(typhoid?) are described as oliguria, coldness of extremities, incoherent
talk, breathing rare and large with long intervals and again hurried (Cheyne-
Stokes respiration), and finally loss of voice and death. The diagnostic
signs of pleurisy are described as a splashing sound (now named as
Hippocratic succussion) or a creak like that of leather (now called pleural
rub). In a book entitled Prognostics, the signs of impending death are so
vividly described that they are still described in the textbooks of medicine
as Hippocratic facies. The titles of some of the books of Hippocratic
collection, given below, give some idea of the range of their contents:
1. Textbook for Physicians.
2. Textbook for Laymen.
3. Lecture Essays for Medical Students and Novices.
4. Epidemics.
5. Collection of Material for Research.
6. On the Sacred Disease (detailed description of patients with hysteria,
anxiety disorders and mental depression).

HIPPOCRATIC APHORISMS
Aphorism literally means a definition. The term is usually used to
describe a principle expressed tersely, in a few words, or a general truth,
conveyed in a short pithy sentence, in such a way that once heard it is
unlikely to be forgotten. Following are some of the well-known
Hippocratic aphorisms:
1. Life is short, art is long, opportunity fugitive, experimenting
dangerous, reasoning difficult.
2. Walking is man’s best medicine.
3. Idleness and lack of occupation drag a person towards evil.
4. Everything in excess is opposed to nature.
5. Weariness without apparent cause indicates disease
(tuberculosis?).
6. Food or drink slightly inferior in itself, but more palatable, should
be preferred to that which is less palatable, though better in itself.
7. Dry season is healthier than rainy season.
8. Cold sweating in conjunction with acute fever indicates death.
9. The old have fewer illnesses than the young, but if any illness becomes
chronic, it carries them to the grave.
10. The very fat are more liable to sudden death than the thin.
11. Hardening of liver in cases of jaundice is a bad sign.
12. Old persons endure fasting most easily; next adults; young persons
not nearly so well; infants the least.
13. Both sleep and insomnolency, when immoderate, are bad.
14. In acute diseases, it is not quite safe to prognosticate either death
or recovery.
15. When sleep puts an end to delirium, it is a good symptom.
16. Persons who have frequent and severe attacks of swooning,
without any manifest cause, die suddenly.
17. Pains and fever occur rather at the formation of pus, than when it
is already formed.
18. In every movement of the body, whenever one begins to have
pain, it will be relieved by rest.
19. In those cases where there is sediment in the urine, there is calculus
in the bladder or kidneys.
20. In persons who cough up frothy blood, the discharge comes from
the lungs.
21. A woman does not take gout, unless her menses have stopped.
22. Eunuchs do not take gout, nor become bald.
23. In acute diseases, coldness of extremities is bad.
24. When bubbles settle on the surface of the urine, they indicate
diseases of the kidneys.
25. Those diseases which medicines do not cure, iron (the knife?)
cures; those which iron cannot cure, fire cures; and those which
fire cannot cure, are to be considered wholly incurable.

HIPPOCRATES’ FOUR HUMORS


At the time of Hippocrates, most people believed that disease was sent
as punishment from the Gods. Treatment was aimed at pleasing the
Gods. Hippocrates went against this view. He believed that good health
depends on the balance of four humors present in the body: blood,
phlegm, black bile and yellow bile. A disturbance in the balance of the
four humors was believed to be the cause of mental or physical illness.
According to him, the four humors gave the following characteristics to
the individual:
Blood gave a person a lively personality and a lot of energy. The
person would enjoy life and the arts.
Phlegm made a person feel lethargic and have a dull personality.
Black bile caused mental depression and sadness.
Yellow bile influenced a person’s temperament. It caused anger and
fiery temper.

Treatment to restore the balance of humors consisted of blood-letting,
emetics and purgatives.
The theory of four humors advocated by Hippocrates, and accepted
by physicians for centuries to come, was obviously wrong. However,
it was a turning point in the history of medicine because it held that disease
was caused by factors inside the body and not by the anger of Gods. A
close look at the contents of Hippocratic collection would reveal that
the physicians of those times had doubtful diagnostic or therapeutic
skills. Religious and cultural taboos forbade the dissection of human
bodies and therefore their concepts of human anatomy were often totally
at variance with the facts. Moreover, treatment with herbal medicines was
usually not really effective. The knowledge of prognosis gained by
careful observations of many patients became a weapon of strength for
public recognition and assertion of supremacy as a physician.

ARISTOTLE
Aristotle (384-322 BC) (Fig. 1.12) is
another Greek famous for his biological
studies. One of his pupils was Alexander
the Great. Aristotle’s work on biology
remained the ultimate authority for many
centuries after his death. He dissected a
large number of animals and classified them
in genera and species. He distinguished
between vertebrates and invertebrates.
Besides biology, Aristotle wrote
extensively on philosophy, physics,
morals, ethics, and politics. He is said to
have written 150 treatises on these topics.

Fig. 1.12: Aristotle

Many of his descriptions of the anatomy of various organs of animals
are still valid. For example, he was the first to describe the chambered
stomach of ruminants. He gave detailed accounts of the structures in the
head, neck, thorax, and abdomen of the animals. He described the structure
and the function of the epiglottis. He mentions that the right kidney is
situated at a higher level in the abdomen than the left kidney.
Aristotle’s embryonic investigations made on chicks can be called
brilliant even by current standards. He could observe the pulsating heart on
the third day of embryonic development in a hen's egg. Unfortunately,
this observation led him to a grossly wrong conclusion, that males are
not essential for reproduction. He opined that transfer of semen to the
female is only accidental and not necessary for procreation. Such ideas
probably formed the basis of the legends of virgin births in Christian
mythology. (In Hindu mythology, a male God is always implicated in
birth of a baby in an unmarried woman). Another misconception created
by him was the role of the brain. In his view, the brain only acted as an
agent for cooling the heart and preventing its overheating by secreting
phlegm (pituita).
Aristotle created the notion that certain “inferior” animals such as
insects came into the world spontaneously from decomposition of the
waste products and hence their growth could not be limited or restrained.
This concept was challenged only at the end of the 17th century. Aristotle
elaborated a physiological system centered at the heart. The lung and
the brain merely functioned as cooling agents. The heart was the most
important organ. Life begins when it begins to beat (in the embryo) and
when the heart stops, the body dies. Thus, though many of the anatomic
observations made by Aristotle were correct, his concepts of the physiology
of the human body were widely off the mark.

CLAUDIUS GALEN
Claudius Galen (129-216 AD)
(Fig. 1.13) was a Greek who
became the most famous physician
in the Roman Empire. He was the
last among the ancient pioneers in
medicine. His medical brilliance,
his incisive anatomical studies and
physiological experiments, and the
sheer volume of his written works
overwhelmed his contemporaries.
Fig. 1.13: Claudius Galen

Galen's father, Nikon, was a highly educated mathematician and
architect in Turkey. He seems to have inculcated in his son an abiding
love of mathematics, logic, and philosophy. Galen’s mother was a
shrewish wife known for her foul temper, shouting ceaselessly at Nikon,
and even biting her maids. Perhaps wistfully, the warring parents gave
their son the name ‘Galen,’ meaning ‘peaceful’. Despite the name, in
adult life, the personality of Galen reflected, as much cantankerous
nature of his mother, as the philosophical nature of his father.
He was initially trained to be a philosopher, but he changed his
mind and became a physician. He studied medicine for a total of 12
years at the famous medical school in Alexandria in Egypt. On his return
to Greece, he became a surgeon at a school of Gladiators. During this
period, he gained experience in trauma and wound treatment. He regarded
wounds as “windows into the body”. Soon, he moved to Rome for the
sake of name and fame. He was successful in this endeavor and remained
the personal physician to three successive Roman Emperors. He was
popularly known as "The Prince of Physicians." Thus the greater part of
his life was spent in the Imperial court experimenting and writing. His
work was so extensive that he is said to have employed 20 scribes to
write down his investigations. It is believed that he wrote 500 treatises
on biology and medicine and various other subjects, such as philosophy
and philology. Amongst Galen’s major works is a 17 volume book, “On
the Usefulness of the Parts of Human Body.” The books on medicine and
biology continued to be used in various medical schools, all over the
world, for the next fifteen hundred years.
Galen revived the methods advocated by Hippocrates about 500
years earlier, but he considered himself far superior to him. According to
him, Hippocrates only showed the path of modern medicine but could
not go far. He left the job for a successor (meaning Galen) to complete it.
Galen put great emphasis on the clinical observations. However, Galen
continued to follow the view that disease was the result of an imbalance
between the four body humors. Galen believed in the use of the
opposites—if a patient suffered from fever, he treated with something
cold; if he appeared cold, he would treat with heat.
Galen was the originator of the science of experimentation in biology
and medicine. Since human dissection was not acceptable at that time,
he dissected thousands of animals to understand the working of human
body. He experimented chiefly on the ape, which he considered to be
nearest to humans in structure and function. He also used pigs and dogs.
This sometimes led to gross errors in his concepts of the anatomy
of the human body.
Most of Galen's observations in anatomy and physiology were
accurate. Galen was the first to cut the spinal cord of living animals at
different levels. From these systematic studies, he was able to demonstrate
the exact function of the diaphragm and the chest muscles in the
mechanism of breathing. In another experiment, he tied the ureters and
observed swelling of the kidneys. He concluded that the urine is formed
in the kidneys and not in urinary bladder as was believed at that time. To
study the function of the nerves, he cut them one by one, and observed
which muscle became paralyzed. He could correctly identify seven of
the 12 cranial nerves, as well as the recurrent laryngeal nerve. He also
discovered the valves of the heart. He recognized the contagiousness of
tuberculosis and the possible spread of rabies by dogs.
Galen’s most revolutionary discovery was that the heart and the
arteries contained blood, not air (pneuma) as believed at that time. For
this purpose, he devised an ingenious experiment. Galen exposed the
heart of a living animal and inserted a fine tube through the wall of the
heart into the left ventricle. He showed that, with each heartbeat bright
red blood, not pneuma, spurted out of the tube.
Galen’s reputation as a researcher is, however, marred by some
funny concepts presented by him. He believed that food material from
the intestine is carried to the liver and converted into blood, which enters
the vena cava and passes up and down the body, and that blood is
burnt by the tissues like wood consumed by fire. Probably his most
erroneous belief was that the heart was divided into two parts:
blood from the right side of the heart went to the liver and the lungs,
whereas blood from the left side of the heart went to the rest of the body
and flowed back to the two halves respectively. The two types of blood
were kept separate and only a small amount of blood passed, from one
side to another, by leaking through the septum of the heart. This false
concept of flow of blood in the body was refuted only fifteen hundred
years later by Vesalius, an anatomist, and Harvey, a physiologist.
In spite of some mistakes and misconceptions created by Galen, the
wealth of accurate details in his writings is astonishing. Galen was one
of the most high-profile personalities in Rome of that era. In addition to
his well-publicized cures, his stream of books and his public anatomical
demonstrations, he was known to publicly criticize and ridicule other
physicians. He became as boastful and combative as the gladiators he
once treated. Galen’s tongue and pen were as sharp as his scalpel, and
led to a number of bitter public disputes.

Examination of the pulse is one of the most lasting techniques
advocated by him. It is fascinating to know that it was two thousand
years ago that Galen wrote a monograph on the examination of the pulse
and mentioned many of the features of the pulse as well as some of its
abnormalities. For example, he wrote that the heart and all the arteries
pulsate with the same rhythm. Pulse can be examined at various places,
but it is best examined at the wrist, for here it is easily visible, there is
little flesh over it; it is not necessary to strip any part of the body, and
it runs a straight course. He further wrote that normally the artery will
seem to be distended in every dimension. In disease, the pulse may be
too fast or too slow; its strength may be too low in some patients or too
high in others. The pauses between the pulses should normally be of
similar duration (regular pulse). Irregularity of the pulse indicates serious
heart disease. He does not fail to advise that, to detect any abnormality,
the physician should examine the pulse of a good number of normal
persons.
Galen had an ego no one could match. “I have made all the medical
discoveries. There is no need of any further research in medicine,” he
declared. It is ironic that the ancient world’s strongest exponent of
experimentation for medical discoveries forbade others from following
his example. For the next 1500 years, Galen was regarded as the ultimate
authority in medical science. Anyone who did not agree with Galen's
writings was not considered worthy of being called a physician.

THE DARK AGES IN WESTERN MEDICINE


After the decline of the Roman Empire from the end of the 2nd century,
a new period, referred to as the "Middle Ages," began. For the next one
thousand years, the populations of Europe suffered grief, sorrow, torture,
invasions and plague. After the division of the old Roman territory, many
problems began to emerge. Europe was divided into several kingdoms, each
declaring war on the other. Barbaric invasions were common. Peasants
were at the mercy of invaders. Castles, built in years to protect a given
population, were destroyed in weeks of siege. Crime was very common.
Punishment was severe. Even theft of a piece of bread could lead to
imprisonment in torture chambers. Travel, even in groups was not safe.
Women were particularly maltreated. Heresy—disagreeing with the Church
in any way—was punishable by death. The charge of heresy led to the
death of thousands of innocent men and women. In the 11th century
began the crusades—invasions of the Muslim territories of Europe and the
Middle East by Christian mobs, which destroyed everything that
came in their way.
The Ancient Greeks and Romans pushed forward the knowledge of
medicine, but the trend reversed with the fall of the Roman Empire. These
next ten centuries were characterized by wars, conflicts and poverty. In
the history of Western medicine, the period between 200 AD and
1500 AD is considered the dark ages. During these centuries, learning in
all branches of art and science, including medicine, came to a standstill.
Libraries and universities in Europe were destroyed by the Christian
crusading mobs, since they belonged to the "pagans" (non-Christians).
Another major factor for the deterioration in medicine was the rise
of Christianity in Europe. The Church-fathers became the custodians of
medical tradition. Sickness was considered to be a punishment from God
for the sins committed. So, according to the priests, the patient could be
cured by seeking forgiveness from God, not by herbs and potions. The
traditional medical remedies were considered witchcraft and outlawed
by the Church. It was firmly believed that an illness could be cured by
charms and spells or going on a pilgrimage or even by a touch by the
King or the Queen. The treatment was also determined by the zodiac
sign under which the patient was born. Based on the zodiac sign of the
patient, the following is a list of instructions for the "doctor" of those
times.
Aries: Avoid incisions on the head and face and cut no veins.
Taurus: Avoid incision in the neck and cut no veins there.
Gemini: Avoid incisions on the shoulders, arms or hands and
cut no vein there.
Cancer: Avoid incisions in the breasts, stomach and lungs.
Leo: Avoid incisions on the nerves, bones, or the back.
Virgo: Avoid opening a wound in the belly or in internal parts.
Libra: Avoid opening wounds in the umbilicus, belly or back.
Scorpio: Avoid cutting the testicles or the anus.
Sagittarius: Avoid incisions in the thighs and fingers.
Capricorn: Avoid cutting the knees or veins in these places.
Aquarius: Avoid cutting the knees or veins in these places.
Pisces: Avoid cutting the feet.
In the Dark Ages, saints also became adjuncts of treatment. They or
their relics were believed to have miraculous powers. Even wars
were fought over some of these relics. The cult of saints
was particularly widespread in Europe. The most famous of such medical
saints were Saints Cosmas and Damian. A posthumous miracle was
attributed to them: they are said to have transplanted a leg taken from
a healthy black slave onto a white devotee in a church whose own leg
had become gangrenous (Fig. 2.1).
Some of these saints became almost "specialists." St. Lucy "cured"
eye diseases; St. Apollonia, tooth problems; St. Fiacre, hemorrhoids;
St. Anthony, leprosy; and St. Roch, plague.
Patients traveled long distances for the sake
of these miraculous cures.
In the medieval period even the so-
called qualified physicians had forgotten
the work of Hippocrates, Aristotle and
Galen. The diagnosis and treatment of a
patient revolved around the doctrine of
four body humors. But the "diagnosis"
was mostly based on the smell and color of
urine kept in a flask and the smell and color of
phlegm. Cloudiness of the upper layers of the
urine was taken to indicate that the origin
of the disease was in the head. Cloudiness
at lower levels was taken as an indication of a disease involving the urinary
bladder or the genitalia.

Fig. 2.1: Transplant of a leg

The most common form of treatment was purging
or blood-letting. These procedures were believed to get rid of poisons or
excess body fluids which were disturbing the balance
of the four body humors. Most elaborate directions were given regarding
the day and time of bloodletting, as well as the vein to be bled. When
only a small amount of blood was to be removed, a leech was used.

BARBER-SURGEONS (FIG. 2.2)


In the medieval period in Europe, barbers (also called barber-surgeons)
became a new breed of "doctors" who performed surgery on patients.
In those days, physicians did not perform surgical treatment. This job
was left to barbers, who often had no education but learnt the art by
apprenticeship to senior barbers. Gradually, these barber-surgeons became
important members of society, especially because their charges were
far less than those of the physicians. However, their social status was nowhere
near that of a qualified "doctor." Even in the 18th century, the lay press
depicted a physician as a clean, bewigged and perfumed individual (Fig.
2.3), whereas a surgeon was likened to a butcher. The barber-surgeons
performed procedures like blood-letting, pulling teeth, amputations,
removal of bladder stones, and treatment of abscesses and ulcers. In the
absence of anesthesia, surgery was performed on a conscious, struggling
patient physically restrained by attendants. Since surgery was performed
only occasionally, hair cutting and shaving provided the day-to-day
source of income. The situation is best epitomized by the amalgamation
of the Company of Barbers and Surgeons in London in 1540. Even up
to the 19th century, surgery was undertaken by itinerant (traveling) surgeons
who specialized in one art, e.g. tooth extraction, removal of a cataract
or a bladder stone. Hernia surgery done by such surgeons often
resulted in amputation of the penis as well.

Fig. 2.2: A barber-surgeon in action
Fig. 2.3: A physician
Hospitals
The earliest documentary evidence of a hospital comes from Sri Lanka in the
4th century BC. In India, King Ashoka (230 BC) is reported to have founded
18 hospitals, each with many physicians and nurses. In Europe,
medical treatment was mostly carried out in the patient's home;
hospitals there are a medieval invention. Thus, in the Dark Ages of medicine,
one development may be considered to have had a positive and permanent
effect on the profession. Initially, hospitals were not meant to be
used as we understand them today. The word 'hospital' is derived from the Latin
"hospes," which means a stranger, a foreigner or a guest. The original
function of a “hospital” was to provide hospitality and shelter for the
travelers of all kinds. Since these institutions were run by the Church,
gradually they began to cater exclusively to the poor, the aged, and the
sick. In Christianity, care of the sick is placed above any other duty, as
if Christ himself were being directly served by serving the sick and the poor.
These institutions were looked after by monks or nuns who offered
some general nursing help, but there were no physicians or surgeons. By
and large, the hospitals were overcrowded and dirty. Within many
hospitals, a strict rule of life, which was almost monastic in character,
involving vows of poverty, chastity and obedience, was upheld. Any
inmate who inherited a property worth more than four pounds a year
had to leave the hospital. The inmates had to attend daily services in the
church, sometimes several times a day. By the thirteenth century, these
institutions began to cater exclusively to patients. They began to be
manned by physicians, though they remained under the overall authority of
the Church.
Leprosy was one of the most feared diseases of the medieval period.
Victims of the disease became outcasts of society. They were
condemned to become beggars, warning people of their approach by
ringing hand-bells. Some hospitals catered exclusively to lepers; these
were isolated from towns and cities.
THE BLACK DEATH


Plague was the biggest medical disaster of medieval medicine. When a
trading ship docked in a seaport in Sicily in 1347, it brought a deadly
cargo. Only a few diseased sailors on board were still alive. All the others
were already dead from a new disease which Europe had never seen
before. The diseased sailors showed strange black swellings the size of
an egg in the armpits and groin. The swellings oozed pus and blood.
Boils and black blotches were seen all over the skin. That is why it came
to be known as the Black Death. All the victims on the ship died within
five days, but by that time the disease had begun to spread with terrifying
speed. Within two years, it had reached all over Europe. The disease
killed people with such speed that an Italian writer said, "The victims
often ate lunch with their friends and dinner with their ancestors in
heaven." People fled in panic. They left their closest relatives to their fate,
to die unattended. Within five years, more than one-third of the population
of Europe was wiped out by the disease.
All efforts to control the epidemic were fruitless because no one
knew its cause. Most, however, realized that the disease spread by
contact with the sufferer or even his clothing. Some said it could spread
even by the stare of the victim. So they covered themselves with thick
clothing and held a cloth to their nose. Some, especially doctors, wore
elaborate masks shaped like a bird's head, which had a holder for burning
incense in the beak (Fig. 2.4). As the deaths mounted, people turned to
God for help. The priests declared it a punishment for their sins. So
many donated all they possessed to the Church, hoping in vain that
the good deed would shield them from the disease. Others resorted to
lucky charms, images of Christ, perfumes or vinegar. The medical
faculty of the University of Paris blamed the alignment of the planets for
the plague. But, as expected, nothing worked. The situation was further
aggravated by the Pope, who declared 1350 to be a holy year. Pilgrims
were advised to visit Rome if they
wanted direct access to heaven.
Thousands traveled to Rome
spreading the disease far and wide.
The “Black Death” recurred four
times in Europe, before the end of the
century. The effects of the “Black
Death” could be seen for centuries to
come. Besides many socioeconomic
effects, it brought a change in the
attitude of people towards religion and
medical profession. Since religious
practices could not save them from
the Black Death, people's confidence in the Church dwindled.

Fig. 2.4: Doctor's attire during the "Black Death"

More
importantly, they lost faith in the medical practices prevalent at that
time. This opened the gates for greater intellectual freedom and growth.
The doctors began to do something they had not done for over 1200
years. They began to question ancient medicine. They wanted to
investigate for themselves rather than blindly believe the ancient teachings.
This was one of the factors which laid the foundation of the Renaissance in
Europe.

THE GOLDEN ERA OF ARABIC MEDICINE


With the collapse of Roman Empire, the works of Hippocrates, Aristotle
and Galen were lost in Europe. These works lived on in the East as
Arabic translations and formed the basis of Arabic medicine. The rise
and fall of Arabic medicine coincides with the rise and fall of Islamic
culture in the Arab world.
In 750 AD the Caliphate was shifted from Damascus to Baghdad. A
new dynasty of rulers gave less attention to military adventures, and
focused instead on learning, culture, and intellectual life. This quickly
made the court of Baghdad the wealthiest and most tolerant in the world.
The Caliphs of Baghdad acquired a reputation as patrons of men of
letters and science. The rich intellectual life in the Islamic World at this
time also led to independent research and new inventions. This ‘Golden
Age’ was a period of broad tolerance and intellectual curiosity with the
translation of many Greek works into Arabic in 750-850 AD. The
progress in medicine was somewhat impeded by the prohibition of
dissection, as well as by the deep respect for the writings of Hippocrates
and Galen. Independent Arabic medical literature started to appear in the
9th century, and reached its prime in the 11th century AD.
From the seventh century onwards, the Caliphs of Baghdad and
Cordova encouraged education among their subjects, to the extent that almost
all adults could read and write. The Arabian physicians were thoroughly
conversant with the works of Galen and other ancient Greek physicians.
They also showed a strong influence of Indian and other Eastern systems
of medicine. Several Indian professors of medicine were residents of
Baghdad during the reign of Caliph Harun al-Rashid. The Hakims of the
Muslim world knew more than a thousand drugs, many of which were
of chemical origin. They could induce anesthesia, perform Cesarean
section, tracheotomy, and complicated eye surgery. They operated on
head fractures, performed dental surgery, including insertion of false
teeth made from the bones of animals. In the medical literature of Arabic
medicine, there are vivid descriptions of diseases like measles, smallpox,
meningitis, hay fever, and syphilis.
Believing in Hippocrates' four humors of the body, the Arab
physicians searched for the causes of a disruption of the harmony of the
humors among six factors: air, food, body rest and movement,
sleep, emotional rest, and excretion and retention. The medical practitioners
were able to distinguish a large number of diseases, some of which they
had learnt from ancient medical sources, and others, like smallpox,
meningitis, whooping cough and hay fever, which they were the first to
describe in detail.
Abu Bakr Al-Razi, known to the West as Rhazes (860-940 AD),
was born in Tehran (Iran). By the age of thirty, he shifted to Baghdad,
where he emerged as the leading physician of his time. There, he became
the director of the first hospital in the Islamic world. His services were
available as easily to a cobbler or a camel-driver as to a prince or an
aristocrat. Early in his medical studies, al-Razi began to compare what
earlier writers had said with his own observations and experiments. If
his observations or interventions contradicted or superseded what other
writers like Hippocrates or Galen had written, he had greater faith in
himself. Towards the end of his life, al-Razi began to edit the vast
compilations of ancient medicine, incorporating his own observations.
The massive manuscript, called Kitab al-Hawi fil-tibb (Comprehensive
Book on Medicine), contained almost all that was known about medicine
by that time. He also wrote a ten-volume book Kitab al-Mansuri (The
Book of Mansur) on anatomy, physiology, hygiene, diet, pathology,
diagnosis, treatment, and surgery. Al-Razi’s most remarkable work,
however, is a much smaller one, on smallpox and measles. Al-Razi
presented the first detailed description of the signs and symptoms
of smallpox, contrasting them with those of measles and other rashes.
He traced the course of the disease and its possible outcomes, including
scarring, blindness, and death.
Al-Razi described a type of unilateral headache closely comparable
with migraine as it is known today. He also described some other
types of headache, due to encephalitis, brain tumors, or obstruction of
the passages in the brain.
Al-Razi’s medical acumen and writings on medicine won him fame
throughout the Arabic-speaking world, and later throughout Europe.
But his egalitarian, anti-authority and anti-religious writings provoked
the enmity of emirs, imams, and most of his fellow philosophers.
Ultimately, he had to flee Baghdad. Like many other great men of medicine,
al-Razi died blind, battered and embittered. Al-Razi is reported to have
told a surgeon who offered to remove his cataract, “I have seen enough
of the world and have no desire to see it further.”
Another Arabic physician who deserves special mention with regard
to internal medicine is Abu Ali ibn Sina (980-1037 AD) (Fig. 2.5), the
"Prince of Physicians." He was known in the West as Avicenna. He
wrote the famous medical book Al-Qanun fil-tibb (The Canon of
Medicine), which was a codification of all Arabic medicine. This book
formed half of the medical curriculum of European universities in the later
part of the 15th century.

Fig. 2.5: Abu Ali ibn Sina
Islamic medicine was more
concerned with the prevention of illness than with its cure. Western
travelers through the Islamic world were struck by the general
cleanliness of the people. Diet and baths were important tools among
the Hakim’s weapons against disease. A number of chemical processes
such as distillation, sublimation, and filtration were known and commonly
used by Islamic scientists in the preparation of medicines.
An Arab physician, Ibn al-Nafis, who worked in Cairo, is believed
to have described the pulmonary circulation and the circulation of blood in
the body in the 13th century. This fact was "rediscovered" centuries
later by William Harvey. According to the Arabic scholars, Ibn al-Nafis,
not Harvey, deserves credit for the discovery of the circulation of blood.
The dark ages for Arab medicine began in the 13th century, with
the invasion of Baghdad by the Mongols, from which it never
recovered. Probably the most important contribution of Arabian medicine
to Western medicine is the fact that it preserved the works of the
ancient Greek pioneers of medicine as Arabic translations. This Arabic
literature was retranslated and used by the Europeans during the Renaissance.
RENAISSANCE OF MEDICINE
Renaissance is a French word meaning rebirth. The period between the
middle of the 14th and the middle of the 18th century is known for the
Renaissance of art, culture, science, and medicine in Europe. It describes a
great change in the European way of thinking in this period. Five
main reasons have been cited for the change. They include:
1. Invention of the printing press: New ideas could spread far and wide
quickly.
2. The revival of classical learning: The period of decay in the social
fabric since the fall of the Roman Empire is known as the Dark Ages. By
the middle of the 14th century, there was a renewed interest in the ancient
Roman and Greek heritage. The study of these ancient manuscripts revived
the earlier endeavor to observe nature closely and think of
explanations for the phenomena observed, independent of what the
Church said. The Church was criticized for the first time for the
evils in society. Particularly after the Black Death, there was a
strong feeling that both religion and the prevalent medical practices
had failed to protect the people against the dreadful disease.
3. Many universities were gradually established: They were
independent of religious control. Intellectuals working there had an
individualistic mental make-up and, therefore, new artistic and
scientific ideas flourished.
4. Voyages of exploration: Greater travel contact between different
societies brought new thoughts, ideas, and attitudes.
5. Gunpowder: This factor is especially relevant to the renaissance in
medicine. The invention of gunpowder made wars bloodier. The
army surgeons saw types of injuries never seen in civilians before.
New ideas were required to deal with such injuries.
There is no set starting point or time when renaissance can be said
to have begun. It happened at different places and different times. The
term seems to convey an impression that the period of the Renaissance was
a golden period for the European population. Actually, this term is
applicable only to a small number of artists and scientists of Europe who
began to work in an environment of intellectual freedom. The condition
of the general population was as bad as, or rather worse than, in the Dark
Ages (the medieval period). Many historians have pointed out that the
period known as the Renaissance was actually the age of Machiavelli, the
Wars of Religion, corrupt Popes, and intensified witch hunts. It was a
period of even greater decline in human values than the Middle Ages.
From the medical point of view, this period was in no way better than
the Middle Ages. The basis of treatment of a disease was still the
disturbance of the four humors. Visual examination and taste of urine were
still the ultimate diagnostic investigation. Treatment still consisted of
leeches and bloodletting. More often than not, surgeons actually ended up
harming their patients. Surgery, like amputation, was done
without a painkiller. Therefore, during such procedures, the patient was
restrained like an animal. The surgical instruments were crude and unsterile,
which caused serious infections. If anyone became ill, there was little
chance of recovery. People seldom reached the age of 50 years. Medical
catastrophes were even worse than those seen in the Middle Ages. During
the Renaissance, trade routes were the perfect means of spreading disease to
new populations.

BLOODLETTING AND PURGATIVES—PANACEA FOR ALL INFLAMMATIONS
Even up to the 18th century, bloodletting was a popular remedy, especially
in France (Fig. 3.1). It was widely believed that most diseases
were caused by “inflammations.” The poisonous material was believed
to be removed by bloodletting through leeches or venesection.
Bloodletting was prescribed to patients suffering from any ailment.
François Broussais was an assistant
professor in a military medical school
and a famous physician of Paris. He
was the leader of the advocates of
bloodletting. During the hospital rounds,
his treatment of various disorders varied
only in the number of leeches
“prescribed.” In 1833, over 41 million
leeches were imported into France.
Leeches became so expensive that only
the rich could afford this “treatment.”
Venesection was a cheaper method of
bloodletting. A patient was expected to
be bled till syncope supervened. In fact,
bloodletting was even considered a tonic.

Fig. 3.1: Bloodletting

Scores of peasants would go to a
physician to be bled (to get rid of the "poisons" in their body). It is believed
that during his last illness in 1799, George Washington, the first president
of the USA, suffering from a peritonsillar abscess, was bled to such a
degree that he died. One may say that he died from the treatment rather
than from the disease.
Getting the bowels open was another popular remedy for removing
"poisons" from the body. The traditional pharmacopeias chiefly
consisted of drugs which acted as laxatives and purgatives. The following
account by a famous physician of London is the best example of the
treatment available in the 19th century: "I was called on to examine
Mrs. ________. Her fingers were painful, swollen and thickly studded
with eruptions from which oozed a semitransparent fluid. As I suspected,
the patient confessed to having neglected her bowels. I prescribed her a
tepid bath and a mild laxative." Whether the patient was cured or not by
the treatment is not mentioned by the physician. Often dangerous
substances were used to get the bowels
open or the bladder emptied. Mercurous
chloride appeared in the bag of every
physician.
Herman Boerhaave (1668-1738)
(Fig. 3.2) was one of the most influential
of the clinicians and teachers of the 18th
century. Boerhaave spent his entire life
in Leiden (Netherlands), which became a
leading medical center of Europe. He was
so famous all over the world that a letter
sent from China addressed only as
"Boerhaave, the greatest physician in
the world," was delivered to him.

Fig. 3.2: Herman Boerhaave

When he died, he left behind an elegant
volume, the title page of which declared that it contained all the secrets
of medicine. On opening the volume, every page, except the last one,
was blank. On that page was written, “Keep the head cool, the feet warm
and the bowels open.” This advice sums up all the art and science of
medicine known in the 18th century.

18TH CENTURY EUROPE—AN EXPORT HOUSE OF DISEASES
Till the late 18th century, a European was a storehouse of diseases. The
reason for such a statement would be clear from the following description
of the way of living of Europeans: “Most people, including the royalty,
did not bathe more than once or twice a year. Clothes were not washed
even for the kings and queens. When they became extremely dirty, they
were replaced by new ones. Old ones were then used as such by the
servants. King Henry IV of France had to use specially made strong
perfumes to cover up his dreadful body odors. Hence, in those times, the
human body was literally a nest of lice and fleas. Human waste was
flung into the streets to mingle with that of dogs and horses. Rats and
other vermin burrowed their way through houses, shops and taverns.
The corpses of dogs, cats and even horses were usually left in the
streets.”
Initially, because of the lack of fast means of transport, populations
of different continents were practically isolated from each other. When
the Europeans started long voyages and became traders and colonizers,
they also “exported” a variety of new diseases to whichever country
they visited. The natives of these countries had never been exposed to
these diseases and therefore had no natural immunity.
Syphilis was brought by Columbus and his seamen from America
to Europe in 1493. Within a few decades, it spread all over Europe. At
one time, about one-third of the population of Europe is believed to have
suffered from syphilis (called the great pox in those times). Many kings of
England and France, and famous artists and musicians of that era, became
infected with the disease and ended up insane in later life, due to the
involvement of the brain in the syphilitic process. Interestingly, syphilis
was called the "French disease" in England and the "English disease" in
France. Vasco da Gama led not only the Portuguese, but also syphilis, to
the Indian subcontinent, from where it spread as far as China and Japan.
Migration of Europeans to the Americas brought diseases like influenza,
bubonic plague, malaria, typhoid fever, cholera, yellow fever, and amoebic
dysentery to the New World. Since the natives had no natural
immunity against such diseases, they spread like hurricanes and wiped
out over 90 percent of the indigenous population. To overcome the
labor shortage, the American whites brought a large number of slaves
from Africa, who brought with them a number of "African diseases" like
yellow fever, and sickle cell anemia. Thus there was an abrupt linkage of
disease environments of Europe, America, and Africa, with devastating
waves of one epidemic after another.

UROSCOPY—THE ULTIMATE
DIAGNOSTIC INVESTIGATION
Urine is now known to be merely a waste product. But throughout the
medieval and Renaissance periods, it was considered a divine fluid: a
window to the body and soul of a person. Hence its examination was
believed to be of great diagnostic and prognostic significance.
By the Middle Ages, the Catholic Church had emerged as the most
powerful force in Europe. The church prohibited physicians from
touching certain parts of the body or even seeing a patient unclothed.
Since urine could be passed on to the doctor from behind a screen, its
examination became the only method of diagnosis of a patient, especially
a female. In the seventh century, a very popular book on uroscopy was
written by a Byzantine physician, Theophilus. The book, called "On
Urines," was translated throughout Europe. About 300 years later, an
influential physician, Isaac Judaeus, enlarged the study to such an
extent that he claimed to diagnose all known diseases by urine
examination. His book described about 20 hues of urine. The medical
practice was made so simple that the book was an instant hit. Since
gazing into a pot of urine was problematic, physicians developed the matula—
a round-bottomed flask made of clear glass. The idea behind the matula
was that since it had the capacity of a urinary bladder, urine would be
seen in “true colors.” In fact, a doctor holding a urine filled flask against
light became a typical image of medicine and later on of chemistry (Fig.
3.3). The flask allowed the examination of all the characteristics of urine
considered essential for the diagnosis viz.
color, clarity, thickness, sediment, odor, and
foam. The final step was testing its taste. If,
in a patient with excessive urine production,
the urine tasted sweet, the diagnosis was
diabetes mellitus (mellitus means sweet);
if it was tasteless (insipid), the diagnosis
was diabetes insipidus. Fortunately,
nowadays, alternative laboratory methods
are available to make the differential
diagnosis.
Fig. 3.3: Uroscopy

ANDREAS VESALIUS
Andreas Vesalius (1514-1564) (Fig. 3.4)
is considered the founder of modern human
anatomy. Actually, he was the forerunner
of the few investigators responsible for the
trend now known as the renaissance of
medical science.
Vesalius was a descendant of a German
family of court physicians. As a medical
student, he attended anatomy lectures at the
University of Paris. The lecturer read from
a book of anatomy written by Galen more
than 1000 years earlier, while his assistant
dissected a corpse.

Fig. 3.4: Andreas Vesalius

Very often the lecturer
could not find an organ as described by
Galen. In such cases, the corpse rather than Galen was held responsible
for the error. During this period, Vesalius felt the necessity of independent
personal investigations into human anatomy. He would very often be
14th to 18th Century Medicine |45

seen examining human bones at a cemetery or a corpse at the site of a


public execution.
On graduation in 1537, he was offered a chair of surgery and anatomy
at the famous University of Padua (Italy). In that capacity, he started the
unheard-of practice of personally dissecting dead bodies. (This "lowly
job" used to be done by an assistant or a barber-surgeon.) His
lectures were based on his personal observations, instead of a
thousand-year-old book. In 1539, the supply of dissecting material increased,
when a Paduan judge became interested in his work and made the bodies of
executed criminals available to him. For more serious offences, the
dissection was started even before the criminal was actually dead. Vesalius
kept meticulous records of the dissections undertaken by him. Finally,
in 1543, when he was only 30 years old,
he published a seven-volume treatise on
human anatomy entitled De humani
corporis fabrica (On the Fabric of the
Human Body) (Fig. 3.5). His book
showed glaring discrepancies in the
description of human anatomy given by
Galen, which was actually based on
dissections of apes and other animals. The
book was an immediate success, but it
enraged the traditionalists who followed
Galen. Even the faculty of Padua, where
he worked, turned against him and called
him a mad man. They argued that “any
advance beyond the knowledge of Galen
was impossible. Anything different
observed by Vesalius was probably
because of changes in the human body
since the times of Galen."

Fig. 3.5: Human musculature by Vesalius

In frustration,
soon after the publication of his book, Vesalius resigned from the
university and became a physician in the imperial court. For over 12
years, he traveled with the army, treating the injured in the battles,
performing surgeries as well as postmortems.
In 1564, Vesalius went on a pilgrimage to the Holy Land (Jerusalem).
On the way back, his ship was wrecked on an island, where he died. Had
a benefactor not paid for his funeral, his remains would have been
thrown to the animals.
Modern medicine is forever indebted to the efforts of Vesalius for the
most accurate description of the anatomy of the human body. Still more
significant was the fact that he dared to challenge the thousand-year-old
concepts of human anatomy given by Galen. By overthrowing the
traditional teachings, relying on his own observations, and correcting his
own claims when found incorrect, Vesalius started a new trend in
medical science. Through his attention to detail and the unprecedented quality
of his anatomical drawings, he set a new standard for future medical
textbooks.
Vesalius made new discoveries and recorded the inaccuracies of
Galen's work in almost every aspect of human anatomy. His detailed
study of the human skeleton revealed that the human mandible was a single
bone (not two), that the human sternum consisted of three parts (not five),
and disproved the common belief that men had one rib fewer than
women. In his dissections of the human heart, Vesalius confidently
refuted Galen's claim of a porous interventricular septum. He discovered
and named the mitral valve in the heart. In the study on the brain,
Vesalius’s most significant contribution was his excellent illustrations
which, for the first time, depicted the corpus callosum, the thalamus,
the caudate nucleus, the lenticular nucleus, the globus pallidus, the
putamen, the pulvinar and the cerebral peduncles. Vesalius also discovered
that the human liver consisted of two lobes (not five, as claimed by Galen),
described the pylorus, and commented on the small size of the appendix in
humans. Actually, the anatomical discoveries made by him are so many
that one is amazed at the single-handed effort put in by Vesalius.

LEONARDO DA VINCI
Leonardo da Vinci (1452-1519) (da
Vinci means "from Vinci") (Fig. 3.6),
most famous for his paintings like the
Mona Lisa and The Last Supper, is
considered the most important artist
that ever lived. He has been described
as the archetype of the "Renaissance
man," a universal genius, a man
infinitely curious and infinitely inventive.
Besides being an artist, he was an
architect, musician, inventor, engineer
and geologist.
Leonardo was an illegitimate son
of an Italian lawyer. Even in his school Fig. 3.6: Leonardo Da Vinci
days, his sketches were of such high quality that his father made him an
apprentice to a renowned painter at the age of fourteen. Over a period of
a few years, his paintings surpassed those of his master.
In the later part of his life, he assumed many other roles including civil
engineer and architect (designing mechanical structures such as bridges
and aqueducts), military planner and weapon designer (designing
rudimentary tanks and catapults). Whatever he undertook, Leonardo
always followed the true method of scientific enquiry: close observation,
repeated testing of the observation, and precise illustration of the object
or the phenomenon. According to Sigmund Freud, the famous
psychologist, Leonardo was like a man who woke up too early in the
darkness while others were still asleep.
Leonardo was a man of high integrity and very sensitive to moral
values. His respect for life led him to become a vegetarian, to the extent
that he considered taking cow’s milk amounted to stealing it from the calf.
It is said that as a young man he often bought caged birds just to release
them from captivity.
Leonardo developed an interest in human anatomy when he was an
apprentice painter. His teacher insisted that the knowledge of human
anatomy was essential for painting life-like images of humans. Initially,
he would visit grave sites late at night and dig up freshly deceased
corpses. Later, when he became an independent artist, he obtained
permission to dissect human corpses at various hospitals in Italy. In
this work, he received collaboration from some doctors as well. In all
Leonardo dissected 30 male and female corpses of different ages and
prepared more than 200 anatomical drawings (Fig. 3.7).
Leonardo drew many images of the human skeleton and was first to
describe the double S form of the backbone. He drew many images of the
lungs, mesentery, urinary tract, sex organs and even coitus (Fig. 3.8).

Fig. 3.7: Body proportions of man by Leonardo
Fig. 3.8: Human skeleton by Leonardo
What strikes one most in Leonardo’s anatomical drawings is the
perfect harmony between artistic talent and technical precision.
Leonardo’s drawings are considered as the most artistic anatomical
drawings of the Renaissance. Leonardo may not be the best anatomist of
his times, but he certainly was the first man to introduce the practice
of making anatomical drawings. Leonardo was quoted as saying: “Do not
busy yourself with things belonging to the eyes by making them enter
through the ears.” The quote probably refers to anatomy professors’
habit of sitting in a chair and reading from the book written by
Galen.
Da Vinci kept a record of his observations in notebooks. The recorded
work comprises 13,000 pages of notes and drawings. Probably to maintain
secrecy of his studies, he used mirror-image writing throughout his life.
Leonardo’s anatomical drawings of the human body were recorded many
decades before those published by Andreas Vesalius, but they made no
contribution to the development of medical science in the Renaissance.
The reason was that all the papers of Leonardo da Vinci were kept
hidden and were discovered 200 years later by William Hunter and
published only in 1898-1916.

PARACELSUS
Theophrastus Philippus Aureolus Bombastus von Hohenheim
(Paracelsus) (1493-1541) (Fig. 3.9) was the most colorful and
controversial figure in the Renaissance period of Western medicine. On one
hand, he set out to smash the centuries-old theories of the four body
humors and treatment by bleeding and purging. On the other hand, he
developed his own brand of medicine, which believed in the role of God,
angels, and devils, as well as, amulets and charms.
In his youth, he attended a number of well-known universities in
Europe one after another, but left dissatisfied. He wondered, “how the
high colleges managed to produce so
many high asses?” It seems Paracelsus
obtained no formal education in
medicine. He claimed to have gained the
knowledge by traveling. “A doctor must
be a traveler. He must seek out knowledge
from old wives, gypsies, sorcerers, and
robbers,” declared Paracelsus. He learnt
metallurgy and alchemy from his father.
Fig. 3.9: Paracelsus

Theophrastus published his first
book on medicine under the name
“Paracelsus,” meaning greater than
Celsus, the great encyclopedist of the
1st century. He meant to announce to
the world that his medicine was better than that of ancient Greece and
Rome. His treatments involved a combination of careful observation,
experimentation and magic. He had greater faith in nature’s power of
healing than that of his fellow doctors. He was ready to try anything.
Soldiers’ lore said that wounds healed better if you applied the dressing to
the sword or spear that caused the wound rather than to the wound itself.
Paracelsus tried it and found that it worked(!): wounds healed better
if they were not treated with the traditional ointment. “If you prevent
infection,” he concluded, “Nature will heal the wound all by itself.”
In 1526, Paracelsus was called upon to treat Johannes Froben, an
influential person of Basel, Switzerland. Froben was gravely ill from a
chronic infection of his right leg. Paracelsus started a comprehensive
treatment and saved the leg. Forben told the world that Paracelsus was
beyond compare and got him appointed city physician with the
responsibility to teach medicine at the local university. Paracelsus got
everything he could wish for—fame, plenty of patients, a home base,
where he could carry out his alchemical experiments and write. He also
got a forum from which he could teach his radical new ideas to the
medical students. But, as was his character, it took him a mere eight months
to destroy all this. Soon after his appointment, Paracelsus antagonized
the entire medical fraternity of Basel by publishing a pamphlet, which among
other things, stated that “most doctors today make terrible mistakes
and harm their patients, because they cling to the teachings of
Hippocrates, Galen, and Avicenna. Their prescriptions were not just
misguided but also useless, contaminated, dangerous, and overpriced.
With a ceaseless toil, I have created a new form of medicine and surgery
based on the foundations of experience, the supreme teacher of all things.”
One day, Paracelsus ceremoniously threw the epitome of classical
medicine, the great Canon of Avicenna into a bonfire. Soon, Paracelsus
was at loggerheads with not only the physicians and pharmacists but
even the judiciary of Basel. Warrants were issued for his arrest, but
Paracelsus fled in the middle of the night with a few things he could
carry. For the remaining part of his life, he was on the run.
Paracelsus wandered from town to town, alone and penniless, but
continued to practice medicine, perform alchemical experiments and,
write. He used procedures of alchemy—extraction of pure metal ores,
and the production of powerful solvents, evaporation, precipitation
and distillation—to produce simple, pure medications. “Stop making
gold,” he advised, “instead, find medicines.” Some of the medicines he
created were active ingredients from medicinal plants; others were usable
compounds of metals such as antimony, arsenic, zinc, and mercury. His
critics pointed out that many of the medicines he prescribed were toxic.
His reply was too advanced for that age: “All things are poisons, for
there is nothing without poisonous qualities. It is the dose which makes
a thing poison.”
Though he cured a large number of patients with his magical
medicines, Paracelsus could not save himself from death in mysterious
circumstances, at the age of forty-eight. Today Paracelsus is seen as a
great visionary, who was the first to attack the foundations of Galenic
medicine. He was an inventor of medical chemistry. He was the first to
suggest the use of mercury compounds in the treatment of syphilis. He
was also the first to suggest that the “miner’s disease” (silicosis) resulted
from the inhalation of metals.

GABRIELE FALLOPPIO
Gabriele Falloppio (1523-1562) (Fig.
3.10) was perhaps the most versatile and
outstanding of the 16th century anatomists.
He is remembered for the first
description of a number of anatomical
structures and disorders like fallopian canal
(facial nerve canal in the temporal bone),
fallopian tubes arising from the uterus,
fallopian pregnancy, and fallopian
ligaments, etc.
Fig. 3.10: Gabriele Falloppio

He was interested in medicine from
his youth, but financial difficulties forced
him to become a priest in a church. This
job could not hold his interest for long and
he became a pupil of a surgeon. Falloppio lived in extreme misery. Soon
after graduation, he tried his hand at surgery. His lack of aptitude in this
field was shown by the fatal outcome of a number of his cases. Then
onwards, he devoted his entire life to anatomy though he continued to
practice as a physician.
In 1549, he accepted the chair of anatomy at the University of Pisa.
During this period he dissected bodies of lions in the Medici zoo and
disproved the contention of Aristotle that the bones of lions are wholly
solid and without marrow. Out of jealousy, he was wrongfully accused
of having performed human vivisection and asked to quit. In spite of the
charge, he was invited to chair the department of anatomy in the famous
Italian University of Padua. He took up the job in 1551 and remained
there till his death in 1562. This was the most productive period of his
career in anatomy.
Falloppio was a painstaking dissector and is remembered for the
precision of his descriptions. He dissected not only corpses of human
adults but also fetuses, newborn infants and children. His most valuable
contribution was in the study of bones. He described the development
of human bones, especially of the cranial bones and the sternum as well
as the development of the hearing organs. He was the first to describe
the development of the teeth, the primary and secondary dentition. His
description of the human auditory apparatus was far superior to that of
Vesalius. He gave the first clear account of the cochlea, the semicircular
canals, the scala vestibuli and tympani as well as the round and oval
windows.
Falloppio was the first to describe the clitoris, and asserted the
existence of hymen in virgins, a matter long under dispute. He also
described and named the “vagina,” the structure earlier known as the
cervix or neck of the uterus. He disproved the popular notion that the
penis entered the uterus during coitus. The results of his lifelong
anatomical investigations were published in a book “Observationes
Anatomicae” in 1561, just one year before his death from pulmonary
tuberculosis.
By the middle of 16th century, syphilis had spread like an epidemic
all over Europe, including Italy. In his book on the “French Disease,”
Falloppio described the use of condoms as a preventive measure against
the transmission of syphilis. According to him, he advised the sheath
(condom) to 1,100 men, and none of them was infected.
AMBROISE PARE
Ambroise Pare (1510-1590) (Fig. 3.11), a
French barber surgeon, is considered the
father of modern surgery. At the early age of
thirteen, he became an apprentice to a barber
surgeon. At 19, he became a “resident
dresser” (i.e. resident surgeon) at the Paris
Hotel Dieu, the finest teaching hospital of
Europe in those days. Soon after graduation,
he joined the French army. His skill and
compassion in treating the soldiers’ wounds
made him loved by his troops. Once he was
captured by the enemy and was about to be
executed. However, in view of his compassion for the sick and
wounded soldiers, his life was spared.

Fig. 3.11: Ambroise Pare
Pare’s pioneer work was chiefly in the treatment of gunshot wounds
and amputation surgery. In those days, all gunshot wounds were classified
as poisoned and hence cauterized by red hot iron or boiling hot oil. So,
the soldiers, already horribly injured, were further scalded. The siege of
Turin in 1537 confronted Pare with terrible carnage. His cauterizing oil
ran out and, as a stop-gap arrangement, he applied an ointment made of
egg yolk, oil of roses and turpentine. To his surprise, this group of
soldiers suffered much less pain and recovered far more quickly. At that
time, he realized that the chief concern of a surgeon should be to ease
suffering of the patients, a point totally ignored by other barber-surgeons.
From then onwards, he created a new and humane concept of medicine and
surgery. He was humble enough to attribute the good results to God by
the statement: “I dressed the wound but God healed him.”
He published his observations on the new method in a treatise
named “Treatment of Gunshot Wounds.” In his own country, he was
ridiculed, since the book was written in French and not Latin, the language
of the scholars and the elite of that era. However, the book was widely
acclaimed and translated into Dutch, Italian, English, German, Spanish,
and Japanese. During his more than 30 years service in the army, Pare
saved thousands of lives. Before his time, a severely wounded soldier
was left to die or his throat was benevolently cut to save suffering.
His popularity in the army reached the King, and Pare became the
official Royal Surgeon. In that capacity, he served four French Kings—
Henry II, Francis II, Charles IX, and Henry III. King Henry II was
childless even 10 years after marriage. After treatment of hypospadias
by Pare, he became the father of 10 children. Pare himself got married at the
age of 64 and had six children.
Amputation, the cutting off of a part of a limb, has been practiced since
prehistoric times, but only in desperate cases of crush injuries. It was
a desperate measure because the patient was most likely to die of bleeding
from the stump. Therefore, the amputations were usually performed at
the level of necrotic tissue. Stumps were then cauterized with red hot
iron or boiling hot oil, which stopped bleeding and was supposed to
prevent rotting of the necrotic tissues. The cauterization was, of course,
extremely painful to the unfortunate patients. By Pare’s time,
gunpowder had been invented and the gunshot injuries on the battlefield
were so devastating that amputations had to be commonly resorted to.
Even amputations at the thigh, which previously were rare, because of
usually fatal bleeding, were now a common necessity. Pare’s great
improvement in amputation surgery was tying off the blood vessels
(ligature) rather than cauterizing them.

WILLIAM HARVEY
William Harvey (1578-1657) (Fig. 3.12) was an Englishman. He studied
at Cambridge and obtained a master’s degree in arts and then went to the
University of Padua (Italy) to study
medicine. After graduation, he came back
to England and married Elizabeth Brown,
daughter of the court physician to the
Queen Elizabeth and King James I. Thus,
he made contacts with the rich and
influential people of London and his
practice flourished. (Contacts with the
high and mighty were as useful in the medical
profession in those days as they are today.)
Harvey quickly moved up the career
ladder and became court physician to
King James I. While earning name and
fame, Harvey conducted a lot of research in human physiology and
embryology. He is best known for his then revolutionary ideas on the
circulation of blood.

Fig. 3.12: William Harvey
In those days the medical profession still believed in the concepts
of human physiology as proclaimed by Galen. According to Galen,
there were two types of blood, which flowed separate from each other.
The “nutritive blood” originated in the liver and flowed in the veins to all
parts of the body, where it was consumed. The other, “vital blood,”
containing “vital spirits” originated in the heart and flowed in the arteries.
The heart was believed to suck blood, not pump it. The lungs were
supposed to have a pumping function. Harvey carried out many
experiments, both anatomical and physiological, on animals. The
dissected heart showed that valves present in the heart allowed blood to
flow in one direction only. Secondly, there were no pores in the septum
as proposed by Galen. (Galen had no idea about pulmonary circulation).
When Harvey removed the beating heart from an animal, it continued to
beat as a pump. The human heart removed from cadavers showed its
blood volume to be about two ounces. By calculating the number of
heart beats and multiplying it by two
ounces, he came to the conclusion that
the body could not possibly make so much
blood per day. Circulation of blood was
the obvious answer. This theory was
further supported by the presence of
valves in the veins, demonstrated by him
(Fig. 3.13). These valves allowed flow of
blood towards the heart but not in the
opposite direction. On the basis of these
observations, Harvey proposed that there
was only one type of blood which
circulated continuously in the arteries and
veins, and the heart acted as a pump for
blood circulation. “Blood moves in a circle.
The arteries are the vessels carrying blood from the heart to the body
and the veins return blood from the body to the heart. Blood is not
consumed in the tissues,” he concluded.

Fig. 3.13: Venous valves

Since microscopes had not been
invented by that time, Harvey did not know about the blood capillaries.
That is why his theory could not explain how blood was transferred
from arteries to the veins.
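Harvey’s quantitative argument can be reconstructed as a back-of-the-envelope calculation. The two ounces per beat comes from the text above; the heart rate and the ounce-to-millilitre conversion are modern illustrative assumptions, not Harvey’s own figures:

```python
# A rough reconstruction of Harvey's reasoning (illustrative figures).
ML_PER_OUNCE = 30                    # assumed: ~30 mL per fluid ounce
beat_volume_ml = 2 * ML_PER_OUNCE    # ~2 ounces expelled per heartbeat
beats_per_day = 72 * 60 * 24         # assumed rate of ~72 beats/minute

blood_per_day_l = beat_volume_ml * beats_per_day / 1000
print(f"Blood expelled per day: ~{blood_per_day_l:,.0f} litres")
# Thousands of litres per day - far more than the body could possibly
# manufacture, so the same blood must be circulating continuously.
```

Whatever exact numbers are assumed, the output is orders of magnitude beyond any plausible daily production of blood, which is the core of Harvey’s argument.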
According to his own account, Harvey conceived the idea of
circulation from the book, “On the Valves of the Veins,” published by his
teacher in anatomy, Hieronymus Fabricius, in 1603. The book
described the structure of the venous valves, but explained their function
as being to retard the flow of blood, to enable the tissues to absorb nutrients.
Within the heart, Harvey described the sequential contraction of the
atria and the ventricles, the role of the mitral and tricuspid valves, as
well as the role of the papillary muscles and the chordae tendineae in the
ventricles. In short, he described the circulation of blood in the various
chambers of the heart as accurately as known today. The difference in
the fetal circulation, as compared to that in the postuterine life, reported
by him, is no less brilliant.
Records show that Harvey had come to the conclusion about the
circulation of blood in 1615 but waited for 13 long years before he dared
to publish them in a book, “Anatomical Exercises on the Motion of the
Heart and Blood in Animals.” The reasons for the delay in publication
of his ideas on circulation are not difficult to understand. At the time of
Harvey, the teachings of Galen were considered as sacred as the Bible.
To challenge his views was considered suicidal for a physician. After
publication of the book, all his fears came true. His ideas were seen to
reject what had been taken for granted for 1400 years. There was a furor
in the medical profession all over Europe. Not only did he lose patients from
his practice, but even common people thought he was mad (“crack-brained”).
Physicians everywhere turned against him and wrote books attacking him.
In his lifetime, Harvey did not get any recognition for his pioneering
work. Instead, he met ridicule, poverty and neglect. His practice suffered
since patients were afraid of a doctor with such revolutionary ideas.
However, soon after his death, using a microscope, Malpighi
demonstrated the presence of blood capillaries in 1661. Within the next 20 to
30 years, Harvey’s circulation of blood was being taught in most of
the medical schools.
It is pertinent to note that an Arab physician and researcher Ibn al
Nafis (1213–1288) had described the circulation of blood 300 years
earlier than Harvey. Most of the facts on circulation described by Harvey
can be found in the book written by Ibn al Nafis. The discovery was
“rediscovered,” or rather demonstrated, in 1628 by Harvey, who (to the
annoyance of Arab medical historians) is given all the credit. However,
Western historians point out that, Harvey might, or might not, have
been aware of the book of Ibn al Nafis. Even if he was aware, it was
Harvey who provided documentary proof of the experiments proving
the circulation of blood.
THOMAS SYDENHAM
Thomas Sydenham (1624-1689)
(Fig. 3.14) was an English physician.
In England, he is known as the English
Hippocrates, and the father of English
medicine. He is remembered for the
revival of Hippocratic methods of
observations and experience in the
treatment of a patient. To him bedside
experience was far more valuable than
the study of anatomy and physiology.
His disdain for anything modern,
for those times, is shown by the
following statement: “Anatomy - botany - nonsense, Sir! I know an old
woman in Covent Garden who understands botany better, and as for
anatomy, my butcher can dissect a joint full and well. No, young man,
you must go to the bedside; it is there alone you can learn disease.”

Fig. 3.14: Thomas Sydenham
refused to read the medical epics such as books of Andreas Vesalius
(Anatomy) or William Harvey (Circulation of Blood).
In 1663, Sydenham became a licentiate of the Royal College of
Physicians. He never became a Fellow of the Royal College of Physicians,
nor did he ever hold any office in a hospital or a Chair in a university. He
remained a private medical practitioner for the greater part of his life, and
enjoyed an overwhelming reputation bordering on idolatry. Sydenham
had ample opportunity to study epidemics. He saw the Great Plague of
London, followed by a severe epidemic of smallpox. He wrote extensively
on these subjects. His first book was Fevers, published in 1666. It was
later expanded into a much larger book called Observations Medicae
(1676), a standard textbook of medicine in England for the next two
centuries. He also presented the theory of an epidemic constitution, i.e.
conditions in the environment (air, season, etc.) which cause the
occurrence of acute diseases. He himself suffered from gout.
His treatise On Gout (1683) is considered a masterpiece. He noted
the link between fleas and typhus fever. Sydenham introduced opium to
medical practice and was the first to use iron-salts in the treatment of
iron-deficiency anemia.
Sydenham helped to popularize the use of cinchona bark in the
treatment of malaria. Cinchona bark was already in use in England for
some decades but fell into disrepute because of toxic effects in some
patients and ineffectiveness, as shown by relapse of the fever, in others.
He determined the optimum dose and insisted that the dose be repeated
at regular intervals, even if the patient was afebrile after the first dose.
This regimen helped to prevent relapses of malarial fever.

QUOTES OF SYDENHAM
On Gout
• Among the remedies which it has pleased Almighty God to give to
man to relieve his sufferings, none is so universal and so efficacious
as opium.
• For humble individuals like myself, there is one poor comfort, which
is this, viz. that gout, unlike any other disease, kills more rich men
than poor, more wise men than simple.
• Gout produces calculus in the kidney... the patient has frequently
to entertain the painful speculation as to whether gout or stone be
the worse disease. Sometimes the stone, on passing, kills the patient,
without waiting for the gout.
• Gouty patients are, generally, either old men, or men who have so
worn themselves out in youth as to have brought on a premature old
age. They are of such dissolute habits, none being more common
than the premature and excessive indulgence in venery, and the like
exhausting passions.
• Great kings, emperors, generals, admirals, and philosophers, all
have died of gout. Hereby Nature shows her impartiality: since
those whom she favors in one way she afflicts in another—a mixture
of good and evil pre-eminently adapted to our frail mortality.
• I confidently affirm that the greater part of those who are supposed
to have died of gout, have died of the medicine rather than the
disease—a statement in which I am supported by my observations.

On Medical Practice
• Why! The Fever itself is Nature’s instrument.
• A man is as old as his arteries.
• In writing the history of a disease, every philosophical hypothesis
whatsoever, that has previously occupied the mind of the author,
should lie in abeyance.
• It is my nature to think when others read.
• A physician must remember that he himself hath no exemption
from the common lot, but that he is bound by the same laws of
mortality and liable to the same ailments and afflictions with his
fellows.
• Nature, in the production of disease, is uniform and consistent, so
much so, that for the same disease in different persons the symptoms
are for the most part the same; and the selfsame phenomena that
you would observe in the sickness of a Socrates you would observe
in the sickness of a simpleton.
• Nothing in medicine is so insignificant as not to merit attention.
• The arrival of a good clown exercises a more beneficial influence
upon the health of a town than of twenty asses laden with drugs.
• The art of medicine was to be properly learned only from its practice
and its exercise.
• Whoever takes up medicine should seriously consider the following
point: he must one day render to the Supreme Judge an account of
the lives of those sick men who have been entrusted to his care.

MARCELLO MALPIGHI
Marcello Malpighi (1628-1694)
(Fig. 3.15) was an Italian physician
and biologist, who directed his
microscope towards biological
investigations and became one of the
greatest microscopists of all time.
Many historians regard Malpighi as
the father of microscopic anatomy in
both animals and plants.
Fig. 3.15: Marcello Malpighi

By the age of 25, Malpighi had
obtained doctorates in both medicine
and philosophy and was appointed as a
teacher in the University of Bologna.
Malpighi pursued his microscopic
studies on animals and plants while
teaching and practicing medicine. In 1661, in his very first publication,
he announced his observations on the anatomy of the frog’s lung.
Malpighi described tiny, thin-walled vessels, which he named
capillaries. He went on to hypothesize that capillaries are the link
between the arteries and veins that allowed blood to flow back to the
heart. Thus, his publication lent further support to the concept of
circulation of blood proposed by William Harvey in 1628. Malpighi’s
views evoked increasing controversy and dissent, mainly from envy and
jealousy, on the part of his colleagues in the university.
In view of the hostile environment in the university, Malpighi
accepted the chair of professor of medicine in the University of Messina
in 1662. Here, he identified the taste buds in the tongue and
regarded them as terminations of nerve fibers. He was first to observe
red blood cells under the microscope and attributed the color of blood to
them. He also studied chick embryo and gave detailed drawings of
different stages of development. Many microscopic anatomical
structures are named after Malpighi, including a skin layer (Malpighi
layer), Malpighian corpuscles in the kidney and the spleen as well as the
Malpighian tubules in the excretory system of insects. He was first to
discover and study the human fingerprints.
After the dissection of a black male, Malpighi made a ground-breaking
discovery about the cause of black skin: he reported the presence of black
pigment in the layers of the skin.
During the last decade of his life, Malpighi suffered because of
personal tragedies, and declining health. The opposition of his medical
colleagues to his views on medicine reached a climax. In 1684, his villa
was burned, his apparatus and microscopes shattered, and his papers,
books, and manuscripts destroyed. He fled to Rome, where Pope
Innocent XII appointed him as his personal physician.

ANTON LEEUWENHOEK
“THE FIRST MICROBIOLOGIST”
Anton van Leeuwenhoek (1632-1723) (Fig. 3.16) was actually a Dutch
cloth merchant. He was the first to see and describe bacteria in 1676.
In his time, the magnifying lens had been invented and was commonly
used by cloth merchants to examine the quality of the weave of a cloth.
While on a business trip to London in 1668, Leeuwenhoek saw drawings
of magnifications of cloth much greater
than he had ever seen in Holland. He
returned to his country and took up lens
grinding. Being meticulous and hard
working, he could produce lenses with
still higher magnification than seen in
London. He achieved a magnification of
×300-500, using a single lens. The compound
microscope had been invented by
that time, but it gave a magnification not
more than 20-30 times.
Fig. 3.16: Anton van Leeuwenhoek

To test the degree of magnification
achieved, Leeuwenhoek used to view
any object that could be placed under
the lens. Initially, he recorded the
magnified view of a cork, and bee’s mouth parts and stings. He hired an
illustrator to accurately draw the images seen under the lens.
Fig. 3.17: Leeuwenhoek’s microscope

Once, Leeuwenhoek was trying to study the cause of the burning
sensation caused by pepper. Examining pepper under the microscope
(Fig. 3.17), he was expecting to see tiny spicules but found none. He
mixed some pepper with water and tried to examine the mixture at
regular intervals. After a week or so, he was amazed to see distinct,
uniquely-shaped organisms, moving around “purposefully” in the drop
of water. Thus, the images of bacteria (he called them “animalcules”)
were recorded for the first time (Fig. 3.18). He sent his observations
along with the drawings to the Royal Society of London, in 1676. The
report was totally disbelieved by the Royal Society. Leeuwenhoek had
to send written testimonials from a vicar of London, as well as from jurists
and doctors, confirming that the report was based on true observations.
Leeuwenhoek spent the rest of his life exploring various forms of
animals and plants under the microscope. He reported the presence of

animalcules in the plaque between the teeth. He described the structure
of the blood cells, muscle fiber, and blood flow in capillaries, human
louse, etc.

Fig. 3.18: Animalcules

He was the first to describe the structure of the spermatozoa
(which he called sperm animals) of humans, dogs and other animals.
Gradually, he became such an important figure in scientific circles that
he was elected a full member of the Royal Society of London, in spite of the
fact that he had no university education. A large number of scientists and
many trendy high-society people, including the Czar of Russia and the
royalty of Europe, visited him to view the amazing animalcules. He
treated everyone the same and did not allow anyone to touch the
microscope. In spite of the discovery of bacteria by Leeuwenhoek in 1676,
the scientific community failed to see any relation between microbes
and disease. It took another 200 years for the science of microbiology to
take off.

STEPHEN HALES
Stephen Hales (1677-1761) (Fig. 3.19)
was an English clergyman with great
interest in science. He had studied
botany and chemistry in Cambridge
University. He made pioneering investigations
in plant physiology, as well as
in cardiovascular and respiratory
physiology. Hales introduced new
techniques in the study of plant physiology.
He demonstrated the loss of water
vapor from the leaves and that this
process encouraged a continuous upward
flow of water and dissolved nutrients
from the roots. He demonstrated the upward flow of sap and measured
the sap pressure.

Fig. 3.19: Stephen Hales
In the history of medicine, he is
best remembered for the estimation
of blood pressure in a horse. In 1731,
an unanesthetized horse was forcibly
put to the ground and brass tubes were
inserted into a carotid artery and a
jugular vein. The blood pressure was
shown by the height to which blood
column rose in glass tubes connected
to the brass tubes (Fig. 3.20). By this
method, he was able to show for the
first time that the arterial blood
pressure was much higher than the
venous blood pressure, and that the
arterial blood pressure varied with the
heartbeat (nowadays known as the
systolic and diastolic blood pressures).

Fig. 3.20: Recording BP of a horse

In his experiment, the arterial
pressure of the horse was recorded as eight feet and three inches. He also
determined the volume of the heartbeat (stroke volume) and output of
the heart per minute. For this purpose, he bled a sheep to death and then
led a pipe from the neck vessels into the still-beating heart. Through
this, he filled the heart with molten wax and measured the volume of the
resultant cast. Thus, Hales may be considered the first experimental
physiologist. In view of these revolutionary discoveries, Hales is
remembered as the second notable figure, after Harvey, in cardiovascular
physiology.
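Hales expressed the pressure as the height of a column of blood rather than in modern units. As a rough illustration (the density values below are modern textbook approximations, not figures from Hales's report), the recorded column height of eight feet three inches can be converted into the mercury-based units used today:

```python
# Convert Hales's observed blood-column height (8 ft 3 in) into mmHg.
# The density values are modern approximations, not from Hales's report.
BLOOD_DENSITY = 1.05    # g/mL, approximate density of whole blood
MERCURY_DENSITY = 13.6  # g/mL, density of mercury

height_in = 8 * 12 + 3              # 99 inches of blood
height_mm = height_in * 25.4        # the same height in millimetres
# A lighter fluid needs a taller column to exert the same pressure,
# so scale by the ratio of the two densities.
pressure_mmhg = height_mm * BLOOD_DENSITY / MERCURY_DENSITY

print(round(pressure_mmhg))         # roughly 194 mmHg
```

The result, close to 194 mmHg, is consistent with the high arterial pressure expected in a large animal, and far above the venous pressures Hales also observed.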
Hales also made numerous notable contributions to respiratory
physiology. He demonstrated that rebreathing from a closed
circuit could be extended if suitable gas absorbers were included.

(CO2 was not known at that time). He measured the size of pulmonary
alveoli, calculated the surface area of the interior of the lung and measured
intrathoracic pressure during normal and forced breathing. He devised a
ventilator by which fresh air could be conveyed into closed spaces like
jails, hospitals and ships’ holds. He also invented an apparatus for
distillation of sea water.

INGENIOUS INVESTIGATIONS
ON DIGESTIVE PHYSIOLOGY
Even up to 1700s, people had no idea what happened to the food in the
gastrointestinal tract. Some scientists thought the stomach and intestines
acted like grinding machines, grinding food into little particles. This idea
was based on the observation of an Italian named Francesco Redi. He
forced birds to swallow glass balls, which indeed crumbled in the birds'
digestive organs. Others thought the food fermented or simply rotted in
the intestines.
Rene Reaumur (1683–1757) was a French physiologist. He trained
a pet bird to swallow a small perforated tube full of sponge or
various types of food and subsequently regurgitate it. The gastric juice
obtained by squeezing the sponge was used to study its effect on various
foods. He also found that in the stomach, the digestion of meat was far
greater than that of starch.
Lazzaro Spallanzani (1729–1799) conducted some of the most daring
experiments in digestive physiology. Initially, he experimented on various
animals like chickens, crows, pigeons, frogs, fish, sheep, horses, cats,
dogs and even snakes, and realized that various animals have different
ways of digesting food. He pushed hard objects, such as tin tubes
stuffed with food, down the animals' throats and looked for the effects
on the food stuffs when the tubes passed out with the feces. Most of
the animals did not like the procedure and often fought back. One snake,

fortunately nonpoisonous, bit him. Falcons and eagles attempted to


attack him. When the dogs tried to bite, he decided to hide the food tubes
in pieces of meat and throw them on the floor near the dogs.
After the experiments mentioned above, Spallanzani wanted to
experiment on humans. Although he never thought that the animals felt
any pain during his experiments, he was aware that such experiments would
not be acceptable to any human. Therefore, he thought it best to
experiment on himself. One morning, he chewed a bite of bread, spit it
out, and weighed it. He stuffed the chewed bread into a small bag made
of linen and sewed the opening closed with needle and thread. Then he
swallowed the bag. Twenty-four hours later the bag came out in the
feces. The thread and bag were intact but the bread was gone. Next,
Spallanzani swallowed food kept inside small perforated wooden tubes.
Next day, the wooden tubes came out with no sign of food in them. He
concluded that digestion depends on the digestive juices soaking into the
food. To study the effect of chewing, he put a chewed piece of pigeon's
heart into one tube and an unchewed piece into another and swallowed both
tubes. The next day he found that a greater part of the chewed piece of
heart had disappeared than of the unchewed piece. Finally, Spallanzani
swallowed thin tubes that would crumble under slight pressure. The tubes
came out unbroken. He concluded that the digestive organs must be
handling food very gently, not grinding like a machine. To confirm the
impression, he swallowed four grapes without chewing. The grapes
came out intact! The human digestive system was clearly not a grinding
machine.

GIOVANNI MORGAGNI
Giovanni Battista Morgagni (1682-1771) (Fig. 3.21) was an Italian
anatomist, but is celebrated as the father of modern pathology. At the age of
16, he joined a famous university in Italy. His brilliance can be judged

from the fact that within three years


of admission, he was awarded
doctor’s degree in medicine as well
as philosophy. After graduation in
1701, he became an assistant of
Antonio Maria Valsalva, a well-
known demonstrator in anatomy.
Morgagni helped Valsalva, parti-
cularly in preparing his celebrated
book on the Anatomy and Diseases
of the Ear, published in 1704.
Subsequently, Morgagni himself
occupied the chair of anatomy and
published a book on anatomy which
particularly included observations on the larynx, the lacrimal apparatus,
and the pelvic organs of the female. As many as 12 structures of the body
are named after Morgagni, e.g. Morgagni's caruncle in the prostate gland,
Morgagni's columns in the rectum, Morgagni's concha in the nasal cavity,
Morgagni's hernia (a congenital parasternal or retrosternal diaphragmatic
hernia), etc.

Fig. 3.21: Giovanni Battista Morgagni
Morgagni’s place in the history of medicine is because of a more
important book published by him at the age of 79 years. The book called
“Seats and Causes of Disease Investigated by means of Anatomy.” In
this book, he reported in precise and exhaustive details his findings of
640 autopsy dissections. For the first time a correlation was made
between the pathology found at postmortem and clinical findings. In this
book he introduced the concept that diagnosis, prognosis, and treatment
of a disease must be based on an exact understanding of the pathological
changes in the anatomical structure.
Morgagni has narrated the circumstances under which the book
“Seats and Causes —” took origin. Having finished editing a new edition

of the book written by Valsalva, he was taking a holiday in the country,


spending much of his time in the company of a young man, who was
curious in many branches of knowledge, including medicine. The friend
suggested that Morgagni should record his own observations on the
patients seen by him, including the postmortem findings. Morgagni
agreed to do so and sent the records to his friend from time to time. Over
a period of two decades, the records accumulated to seventy communi-
cations. The friend insisted that the reports be published without
abridgement. Thus, at the age of 79, Morgagni published his most
prestigious massive two volume book. Notwithstanding its bulk, the
book was reprinted several times in its original Latin, and was translated
in French, English, and German.
Morgagni was the first to describe cerebral gumma, diseases of the
heart valves, Fallot's tetralogy, aortic coarctation, and pneumonia
with consolidation, and he recorded the first case of the disorder
subsequently named Stokes-Adams syndrome.
Morgagni died of a ruptured heart at the age of 89. His absolute faith
in the anatomical basis of the study of disease is reflected in the following
quotation from his book: “For those who have dissected or inspected
many, have at least learnt to doubt, when others, who are ignorant of
anatomy, and do not take the trouble to attend to it, are in no doubt at
all.”
Morgagni’s work gave a final blow to the ancient concept of four
humors and their disturbance as a cause of disease. Morgagni’s contri-
bution to the understanding of disease may well rank with the
contributions of Andreas Vesalius and William Harvey. According to
Virchow “Morgagni introduced modern pathology, and with him begins
modern medicine.”

LEOPOLD AUENBRUGGER
Joseph Leopold Auenbrugger (1722-
1809) (Fig. 3.22) was an Austrian
physician. He is remembered for the
development of a new clinical method
in medicine, namely, percussion.
By the middle of the 18th century,
physicians began to look for diagnostic
methods other than inspection and
palpation known since ancient times.
An Austrian physician by the name of
Leopold Auenbrugger came out with a
new method, called percussion, in a
twenty-four-page monograph,
"Inventum Novum." Published in 1761, it was the fruit of seven years'
labor. He attributed the discovery to his boyhood experience of watching
his father, a wine merchant, tapping a barrel of wine to determine the
fluid level. Auenbrugger tapped the patients with his fingertips, with
the hand held closed, to determine whether the sound was resonant or dull,
which indicated the presence of air or fluid in the chest underneath.

Fig. 3.22: Joseph Leopold Auenbrugger
In the normal chest, the lungs, when percussed, gave a sound like a
drum over which a heavy cloth had been placed. When the lungs were
consolidated, as in pneumonia, the sound resembled that produced by
tapping the fleshy part of the thigh. Auenbrugger also noticed that the
area over the heart gave a modified, dull sound. In this way the size of
the heart could be delineated. He confirmed these observations by comparison
with postmortem findings. In addition, he made many experiments by
injecting fluid into the pleural cavity of dead bodies. He showed that by
percussion it was possible to tell the exact limit of the fluid present and
decide where an effort should be made for its removal.

Auenbrugger spent another ten years investigating the results of


percussion in various stages of pulmonary tuberculosis. He showed
how percussion helped to locate the position and the size of tubercular
cavities in the lungs.
The following paragraph, given in the preface of Inventum Novum,
explains the fears of Auenbrugger about his discovery: “In making public
my discoveries respecting this matter (percussion), I have been actuated
neither by an itch for writing, nor a fondness for speculation, but by the
desire of submitting to my brethren the fruits of seven years’ obser-
vations and reflections. In doing so, I have not been unconscious of the
dangers I must encounter; since it has always been the fate of those who
have illustrated or improved the arts and sciences by their discoveries,
to be beset by envy, malice, hatred, detraction, and calumny.”
Auenbrugger had to face stiff opposition and ridicule for the method
advocated by him. Percussion, as a method of clinical examination,
received general acceptance only when Jean-Nicolas Corvisart,
a prominent French physician and physician to Napoleon Bonaparte,
advocated its use by writing a book in 1808. At a later stage, a percussion
hammer made of hard rubber came to be used for percussion instead of
tapping with the fingertips.

JOHN HUNTER
John Hunter (1728-1793) (Fig. 3.23) was the son of a Scottish farmer.
He showed little interest in studies and dropped out of school. By the
time John was 20 years old, his elder brother, William Hunter, had
become a famous anatomist and surgeon in London, who ran a prestigious
anatomy school, as well. William Hunter gave his good-for-nothing
younger brother work as manager of the dissection room of the anatomy
school. Thus, John Hunter arrived in London, crude and uneducated.

His brother tried to educate him at


Oxford University, but formal
education could not hold his interest
for more than a few months. The
crude part of his personality never
left him. He never learned tact or
social grace. In spite of all these
failings, John Hunter rose to a
position of great eminence as an
anatomist and surgeon.
In his brother’s anatomy school,
John Hunter showed a great talent
in dissection and became an
exceptionally good technical anatomist.

Fig. 3.23: John Hunter

During this period, he came in contact with a number of surgeons
being trained in the art by his elder brother. Thus, he got interested in
surgery, as well. In 1760, he was commissioned in the army, where he took
care of the wounded soldiers, but also continued with his experimental
work. At the age of 40 years, he passed the examination for the diploma
of the Company of Surgeons and joined a large hospital as a surgeon.
John Hunter is remembered nowadays as an anatomist and a surgeon.
In fact, he was a pioneer in comparative anatomy, experimental surgery,
and experimental dentistry. He created a museum of human and
comparative anatomy, which by the time of his death contained 14,000
specimens. Eventually, this museum was acquired by the Royal College
of Surgeons of England and is still preserved. In his most famous
experiment, John Hunter infected himself with syphilis and studied the
course of the disease and its treatment with mercury and cautery. His
experience of the disease is described in a book “Treatise on the Venereal
disease” which became a classic. Some other books written by him

include “A Treatise on the Blood, Inflammation and Gun-shot Wounds”,


and “The Natural History of Human Teeth.”
John Hunter had a terrible temper. In the later years of his life,
Hunter would develop an attack of angina whenever he lost his cool. He
is quoted as saying: “My life is at the mercy of any rogue who chooses
to provoke me.” Sure enough, he died of a heart attack after a stormy
meeting of the board of directors of the hospital where he worked.

ROLE OF CRIMINALS IN THE
DEVELOPMENT OF ANATOMY
The human dead body has been considered sacred in most societies
since time immemorial. Dissection of a human corpse was illegal in
England till 1832, when the Anatomy Act was passed. Before that, law
permitted dissection of only dead bodies of criminals hanged for various
offences. Since knowledge of anatomy was essential for the training of a
surgeon, the shortage of corpses for dissection remained a chronic
problem. Hence, an illicit trade in corpses was prevalent in England. A
group of criminals obtained dead bodies not only from the hospitals, by
claiming to be relatives of the deceased, but also by digging up fresh graves.
The body snatchers were nicknamed "resurrection men" by the
press. Fights in the graveyards between rival gangs were fairly common.
These criminals were hand-in-glove with the teachers of anatomy and
surgeons. A surgeon would go to any extent to obtain the dead body of an
"interesting" case that had died in his care. The trade in corpses was so
lucrative that people were often murdered to maintain a regular supply of
dead bodies.
The Anatomy Act of 1832 permitted dissection of human bodies,
but only after a proper death certificate and the consent of a relative, if
any. Gradually, the situation changed so much that volunteering one’s

body for dissection after death came to be considered an act of


benevolence.
In contrast, dissection of human corpses was allowed in Italy from
1250 onwards. There are official records of postmortems conducted to
find out the cause of death. Consequently, continental Europeans were better
informed
about human anatomy than the British. For centuries, surgeons of England
flocked to Italy to gain knowledge of human anatomy.

MATTHEW BAILLIE
Matthew Baillie (1761-1823) (Fig. 3.24)
was a famous physician and anatomist in
London. He is remembered for the first
text-book of pathology written by him.
On the advice of his uncle, William
Hunter, the famous surgeon and anatomist,
Baillie chose medicine as his career. As a
medical student, he assisted William
Hunter in carrying out anatomic demons-
trations and supervising dissections
undertaken by other medical students.
Baillie graduated in 1786 and became a
Fellow of the Royal College of Physicians
in 1790.

Fig. 3.24: Matthew Baillie
In 1783, on the death of William Hunter, Baillie inherited a sum of
5000 pounds, Hunter’s house and the anatomy museum. Baillie took on
William Hunter’s anatomy lectures and proved a successful teacher. He
became deeply interested in morbid anatomy (pathology). His
demonstrations were remarkable for their clarity and precision.
In 1791, after getting his FRCP, Baillie started to practice medicine. As
a physician, Baillie was famous for his clinical acumen and for the clarity
and simplicity with which he expressed his opinions. In 1793, Baillie published


the work for which he is famous, “The Morbid Anatomy of Some of the
Most Important Parts of the Human Body." It was the first book on
morbid anatomy as a subject in itself. Rather than giving the history and
symptoms of every case, as had been the trend at that time, Baillie
discussed the morbid appearances of each organ in different disorders.
The book contained all the information on the subject of morbid anatomy
available at that time. The book was published in English, not in Latin,
as was the custom in those days. It was immediately translated into
French and German.
For years, Baillie worked for sixteen hours a day. Ultimately, the
large practice overwhelmed him and he developed tuberculosis and died
in 1823, at the age of 62.

JOSEPH PRIESTLEY
Joseph Priestley (1733-1804)
(Fig. 3.25) was a British clergyman who
discovered many important
gases, including carbon dioxide,
oxygen, carbon monoxide, and nitrous
oxide.
To begin with, Priestley was not
at all interested in science and had no
formal education in this subject. He
was a pastor in a small church in
Leeds. His interest in this field was
aroused during a visit to London, in
1766, when he had a chance to meet
Benjamin Franklin, one of the most
prominent scientists of that time. Priestley and Franklin became
lifelong friends.

Fig. 3.25: Joseph Priestley

Priestley’s house in Leeds was close to a brewery. He was intrigued


by the smell of the “air” that floated from the fermenting grain. In his
first experiment, he was able to show that this brewery gas extinguished
lighted wood chips. He also noticed that the gas was heavier than normal
air, since it drifted to the ground around the vats. The gas was later
identified as carbon dioxide. Priestley even devised a method to produce
the same gas, “the heavy gas,” as he called it, in his home laboratory.
When the gas was dissolved in water, it had a very pleasant and tangy
taste. This observation led to the development of soda water. For the
invention of soda water, he was elected to the French Academy of
Sciences in 1772, and also received a medal from the Royal Society,
London, in 1773. (Some people may like to remember Priestley as the
father of colas).
In 1772, Priestley was the first to demonstrate the phenomenon of
photosynthesis: that plants take in carbon dioxide and release oxygen.
The same year, he produced the gas nitrous oxide, which later became
known as laughing gas. Many decades later, the laughing gas was to
become the first anesthetic agent.
In 1774, Priestley was able to produce a gas from mercuric oxide,
which made a candle burn more brightly. All other gases that he had
tested had extinguished the flame. This was the discovery of oxygen,
though he did not realize the importance of the discovery
and did not name the gas. In addition, he also isolated and described
the properties of ammonia, sulfur dioxide, and carbon monoxide.
Priestley’s nonconformist religious and political views eventually
led him into a big trouble. He published a book, “History of Corruptions
of Christianity,” in 1785 in which he criticized the Church and openly
supported the French and the American Revolutions. Angry mobs burnt
down his home and church in 1791. He and his family had to flee to the
United States in 1794, where he received an enthusiastic welcome.

WILLIAM WITHERING
William Withering (1741-1799) (Fig.
3.26) was a British physician and a
botanist. He graduated in medicine from
Edinburgh in 1766. He is remembered
for his painstaking work in the use of an
herbal medicine, foxglove, for the
treatment of dropsy, as well as books on
plants and ores.
Early in his career, Withering fell in
love with a botany illustrator. From the
historical records, it is not clear whether
he got interested in botany for her sake,
or whether botany was already his chief interest,
which attracted her to him. In any case,
Withering published a book in 1775,
entitled "The Botanical Arrangement of all the Vegetables naturally
growing in Great Britain." This book went into many editions and made
Withering famous in international scientific circles.

Fig. 3.26: William Withering
Besides his interest in Botany, Withering continued his medical
practice. He had strong sympathies for the poor and he became associated
with a hospital for the poor. Thus he was able to gain a vast clinical
experience and became a famous physician in Birmingham. One of his
private patients was a case of severe dropsy (known as congestive heart
failure these days). He told the patient that his disease was so advanced
that there was no possible treatment, and that he was unlikely to live
more than a few weeks. Some months later, he was amazed to see the
patient alive and in fairly good condition. On enquiry, he was informed
that the patient had consulted a village woman who sometimes cured people
after qualified physicians had failed. For a financial consideration, the
woman revealed that she gave a mixture of about 20 herbs. The botanically
trained Withering noticed that one of the herbs, foxglove, was possibly
responsible for the cure.
Foxglove is a plant with beautiful
flowers found as a wild plant all over
England (Fig. 3.27). Its leaves were known
to have toxic effects on animals. The
foxglove leaves were also used as a folk
medicine for a variety of ailments such as
adenitis, bronchitis, fevers and even
tuberculosis. Withering started using
foxglove leaves in all his cases of dropsy,
and maintained a detailed record. He
published his experience of the herbal drug
in 163 cases of dropsy, in 1785. The book
was entitled “An Account of the foxglove
and Some of Its Medical Uses; with Fig. 3.27: Foxglove
practical remarks on Dropsy and Other
Diseases.” In the preface of the book, Withering wrote:
“It would have been an easy task to have given selected cases,
whose successful treatment would have spoken strongly in favor of the
medicine, and perhaps been flattering to my own reputation. But Truth
and Science would condemn the procedure. I have therefore mentioned
every case—proper or improper, successful or otherwise.”
In the book, Withering was bluntly honest about his own mistakes
in determining the effective and safe doses of the drug. In almost diary-like
fashion, he confessed to years of overmedicating patients to dangerous
levels, with a host of side effects, sometimes leading to more rapid death.
When Withering died in 1799, his friends had a bunch of foxgloves carved
on his memorial.

The active ingredient of foxglove was isolated by a French pharmacist


in 1869 and called “digitalin”. Six years later, a German chemist isolated
the pure glycoside and named it “digitoxin.” For over 100 years, the
glycoside, later called digitalis, continued to be extracted from foxglove
leaves and remained the most important cardiotonic drug in cases with
congestive heart failure. Only recently has the discovery of other, less toxic
cardiotonic drugs led to its disuse in most cases of congestive
heart failure. Digitalis still has a role in the treatment of heart failure
associated with certain types of cardiac arrhythmias.

EDWARD JENNER
Edward Jenner (1749-1823) (Fig. 3.28)
was an English doctor, the pioneer of
smallpox vaccination and the father of
immunology. He also made significant
researches in biology.
In the 18th century, smallpox was
rampant all over the world. It was greatly
feared because it killed one in three of those
who caught it and badly disfigured those
who were lucky enough to survive the
infection. It has been estimated that about
60 percent of the population suffered
from smallpox. Thus, it was one of the
commonest causes of death in the population,
especially in infants and young
children. (Syphilis, known as the "great pox," was a deadly disease for adults
in those days.)

Fig. 3.28: Edward Jenner
Edward Jenner started medical practice in a small village of England
after a short stint of apprenticeship under a physician. In 1768, while

examining a country woman, he asked about any history of smallpox


infection in the past. Her reply was: “I cannot take that disease for I
have had cowpox”. (Cowpox is a viral disease causing ulcers on the
udders of the cows. Similar ulcers may develop on the hands of those
who milk such cows. The patient develops vesicles on the hand or
fingers or wrist which suppurate as circular ulcers that heal in a few
weeks). This concept was widely believed by the country folks but no
credence was given by the medical practitioners of those times. Her
statement laid the foundation of the work Jenner was to pursue for the
greater part of his life.
In 1770, Jenner went to London, to complete his medical education
under a famous surgeon, John Hunter. The teacher and the pupil
developed a life-long friendship and continued to exchange letters on
mutual scientific interests. In contrast to the ridicule heaped upon him
by the fellow country medical practitioners, Dr Hunter encouraged him
to investigate the idea of the possible protective role of cowpox against
smallpox. Hunter’s famous advice was “Don’t think, try the experiment.”
On completion of his medical education, Jenner was offered a position
on one of Captain Cook's world expeditions. However, Jenner
declined the offer, since he was absorbed in his idea of research on
smallpox.
Jenner restarted medical practice in his small native village. Soon he
developed a busy practice, visiting patients, on a horseback over an area
of 400 square miles. For the next 16 years, he kept records of all the
milkmen and milkmaids who suffered from cowpox. None of them
suffered from smallpox. In 1796, Jenner felt bold enough to attempt the
first experimental vaccination. Pus taken from the hand of a milkmaid,
Sarah Nelmes, a patient with cowpox, was inserted into the arm of a healthy
eight-year-old boy. The boy developed cowpox. Six weeks later, matter
taken from a smallpox pustule was injected into the arm of the boy. The
boy did not catch smallpox. After that, Jenner’s entire life was devoted
to smallpox vaccination.

When Jenner had made 13 such experiments successfully, including


his own 11-month-old son, he requested the Royal Society of London,
in 1798, for permission to present his work. In reply the president of
the society rejected the request and advised him not to risk his reputation
by presenting such an “absurd work.” Jenner chose to publish the
results in the form of a monograph, “An Enquiry into the Causes and
Effects of the Variolae Vaccinae," in 1798. Jenner had to face the prejudice of
the medical physicians who dominated London. They could not accept
that a country doctor could make such an important breakthrough in
medicine. In 1802, Jenner petitioned the British Parliament for
recognition of his work. In spite of bitter opposition by the medical
practitioners of London, his work was accepted as genuine and he was awarded
10,000 pounds by the government.
Jenner was one of the few lucky research workers in medicine of
those days whose discovery was accepted in his lifetime. Within years,
vaccination was practiced widely in England, Europe and the USA. In
1813, Jenner was awarded the degree of MD by the University of
Oxford. In 1821, he was appointed Physician Extraordinary to King
George IV.
Though smallpox vaccination was widely practiced, large areas of
Africa and Asia did not have the benefit of proper medical care. Smallpox
continued to disfigure and kill people. As late as 1967, two million
people died of smallpox. As a result of WHO’s worldwide vaccination
program, smallpox was totally eradicated from the face of the earth by
1980. This is one of the few success stories of the efforts of WHO in the
eradication of infectious diseases.
Besides his interest in medicine and smallpox, Jenner made important
discoveries in biology. He studied the phenomena of hibernation as well
as the migration of birds. His paper on the breeding habits of the cuckoo
was found so extraordinary that, in 1789, he was elected as a Fellow of
the Royal Society, London, the greatest honour that could be bestowed
upon a scientist in those days.

The cuckoo is unique among birds in the way it uses other
species to rear its young. It lays a single egg in the nest of
a bird of another species, most commonly the hedge sparrow. When the
egg hatches, the foster-parents feed and raise the young cuckoo as if it were
their own. The eggs or the babies of the birds who built the nest disappear
mysteriously. Jenner set out to investigate how only the baby cuckoo
survives in each nest and why its parents adopt this strange way of
breeding. Jenner’s investigations revealed that the baby cuckoo pushes
out of the nest all the eggs or the young ones of the foster-parent bird, so
that it can get all the food brought in by the foster parent. This strange
nesting behavior was explained by the fact that the cuckoo appeared in
England in mid-April and left within 11 weeks, whereas it took 15 weeks for
the eggs to hatch and the offspring to be ready to fly away. Fostering was the
ideal solution. This hypothesis was proved correct only in the 20th
century, when the whole process was photographed.
In spite of the monumental researches in medicine in the Renaissance
period, medical practice changed little. Still, there was no light on
the cause of most diseases. Even when a disease was diagnosed, there were no
effective therapeutic measures. Leeches and bloodletting were still the
standard treatment of all physicians, including Harvey. Modern
medicine was still more than one hundred years away.

DEVELOPMENT OF THERMOMETER
The thermometer is probably the most widely used instrument in clinical
practice today. Increased body warmth was recognized as a sign of
ill-health ever since the times of Hippocrates, but it was judged by the
physician merely by touching the forehead of the patient with a hand.
There was no instrument to measure it. Though thermometers were
invented by the 17th century, it would be astounding to know that a
clinical thermometer was seldom used by the physicians till late in the
19th century.
Galileo, an Italian, invented the first thermometer, in 1592. It was
an air thermometer, consisting of a glass bulb with an attached tube
dipped in a liquid. When the bulb was warmed, say by a hand, the air
inside was seen to expand and some of it escaped from the tube. When
the hand was removed, the air in the bulb contracted and the liquid rose
in the tube. The rise of liquid was proportionate to the degree of warmth,
but the tube was not calibrated. Galileo called the instrument
“thermoscope.” The instrument began to be used for the rough estimation
of the ambient temperature. Within a few years, Santorio, a friend of
Galileo, improved the thermoscope by adding a numerical scale. It was
the first air thermometer to be used in meteorological observations.
Santorio was also interested in physiological experiments. He devised
an “air thermoscope” which could be used to measure the oral temperature
in humans (Fig. 4.1).
The next significant advance in thermometry was the use of alcohol
and mercury instead of air, by Daniel Gabriel Fahrenheit (1686-
1736). His mercury thermometer consisted of a capillary tube partially
filled with mercury. The mercury was heated to expand and when it
reached the tip of the capillary, the capillary was sealed at both ends. He
used the scale now known after his name. Anders Celsius (1701-1744)
19th Century Medicine | 87

devised the centigrade scale. In this scale, the temperature ranged
between the boiling point of water, at 0 degrees, and the freezing point
of water, at 100 degrees. Another scientist suggested that the scale be
reversed, i.e. the freezing point of water be called 0 degrees and the
boiling point 100 degrees. Thus the Celsius scale was born, and it
continues to be used throughout the world. However, for over a century,
the thermometer was seldom used by physicians.
Fig. 4.1: The air thermoscope

In 1868, Carl Wunderlich published temperature recordings from over
1 million readings in over 25,000 patients, made with a foot-long
thermometer used in the axilla. He established the range of normal
temperature as 36.3 to 37.5 degrees Celsius. The problem with this
thermometer, and all the earlier designs, was not only the inconvenient
size but also the fact that the reading had to be taken while the
thermometer was still in the axilla. When taken out, it showed the
temperature of the environment, not of the patient. This
problem was solved by Aitkin in 1852. He devised a mercury instrument
with a bend in the capillary near the mercury reservoir, so that the
mercury did not drop back even when it was no longer in the axilla. It
was left to Thomas Clifford Allbutt, in 1866, to design a convenient and
portable 6-inch mercury thermometer. Within a few decades, recording
the temperature of a patient became a routine clinical investigation.
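The two scales described above are related by simple linear arithmetic. As an illustrative sketch (not part of the original text), the conversion and Wunderlich's normal range can be expressed as:

```python
def c_to_f(c: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# Wunderlich's normal axillary range, 36.3-37.5 degrees Celsius
low, high = c_to_f(36.3), c_to_f(37.5)
print(f"{low:.1f}-{high:.1f} F")  # prints "97.3-99.5 F"
```

The fixed points anchor the formula: 0 °C maps to 32 °F and 100 °C to 212 °F, exactly the freezing and boiling points of water mentioned above.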

RENE LAENNEC
Rene-Theophile-Hyacinthe
Laennec (1781–1826) (Fig. 4.2)
was the greatest French physician
of the 19th century. His most notable
contribution to medicine was the
invention of the stethoscope.
Laennec was a student of Jean-Nicolas Corvisart. Like his teacher,
Laennec used to listen to the heart sounds by applying his ear directly
to the chest of the patient. While examining a young female patient, he
could not bring himself to use this method "because of the age and
gender of the patient."

Fig. 4.2: Rene-Theophile-Hyacinthe Laennec

As a substitute for direct auscultation, he rolled a paper into a cylinder
and applied one end to the chest of the patient and the other to his ear.
He was surprised to find that the heart sounds could be heard with
better quality. Laennec went on to use a hollow wooden cylinder,
initially calling it "the cylinder," but later chose the name stethoscope,
from the Greek words stethos (chest) and scope (to look at) (Fig. 4.3).
In his book, Treatise on Mediate Auscultation (1819), Laennec described
the different sounds produced in various diseases of the lungs, such as
the various stages of bronchitis, pneumonia, and, most important of all
for those times, tuberculosis. Abnormal heart sounds produced by
some valvular defects of the heart were also described in the book. The
observations on the sounds heard in the chest of terminally ill patients
were correlated with the postmortem findings. The book is considered
to rank with the works of Vesalius, Harvey, and Hippocrates. Due to
exposure to many tubercular patients in the course of his researches,
Laennec himself developed the disease, which killed him in 1826, at the
age of forty-five.

Fig. 4.3: Laennec's stethoscope
Laennec's studies on tuberculosis were monumental. He was the first
to note that any part of the body could be affected by tuberculosis, and
that the tubercular lesion was similar in each tissue. Earlier, these were
thought to be different diseases. He was also the first to describe cirrhosis
of the liver, as well as the tumor melanoma.
Use of the stethoscope gained gradual acceptance in the medical
profession all over the world. Later in the same century, a rubber tube
was found more convenient, and the familiar "two-ear" stethoscope was
invented by an American, George Cammann, in 1852. By the beginning
of 20th century, the four cardinal methods of physical diagnosis, namely,
inspection, palpation, percussion and auscultation were firmly
established.

Quotes of Laennec
• Do not fear to repeat what has already been said. Men need the
truth dinned into their ears many times and from all sides. The first
time makes them prick up their ears, the second register, and the
third time, the information enters the ears.
• I risked my life, but the book I am going to publish will be, I hope,
useful enough sooner or later to be worth the life of a man.

ADVANCES IN PHYSIOLOGY OF DIGESTION


William Beaumont (1785-1853) (Fig. 4.4), an American surgeon,
gained international recognition in the physiology of digestion. His
contribution was a series of experiments conducted on a patient who
became known as "the man with a hole in his stomach." For centuries,
the stomach was thought to produce heat that somehow cooked the
ingested food. Alternatively, the stomach was supposed to act as a
grinding mill, or a fermenting vat. Through his observations on one
patient, Beaumont revolutionized the concepts of human digestion.

Fig. 4.4: William Beaumont
Beaumont was born into a poor family of farmers. He had little
formal education. After apprenticeship with a physician, Dr Benjamin
Chandler, Beaumont became a surgeon in the army. In 1822, he was
posted in a remote island on the frontier of the USA, close to the
American-Canadian border. A 19-year-old boy, Alexis St. Martin, was
brought to him; he had been accidentally shot at close range. The
wound in the anterior wall of the stomach was of the size of the palm of
a man. It took one year for the wound to heal. Still, a small fistula in the
stomach remained, partly covered by a flap of tissue around the
opening. The hole in the stomach provided a unique
opportunity to perform a large number of experiments on the gastric
function. Beaumont introduced various types of foods into the stomach
and took out samples after various intervals. These samples as well as
gastric juice obtained from empty stomach were sent to several chemists,
who provided the chemical analysis of the samples. Beaumont was, for
the first time, able to show the presence of hydrochloric acid in the
gastric juice. The action of acid was so strong that even the hardest bone
could be dissolved. He observed that vegetables were less easily digested
in the stomach than other foods. He also observed that the secretion of
gastric juice occurred only after ingestion of food; an empty stomach
secreted very little juice.
These studies were made possible because Beaumont had appointed
Martin as his household servant. When Beaumont began to conduct
lecture tours with Martin as a showpiece, the latter proved difficult.
Ultimately, Martin signed a contract for a fee, giving Beaumont exclusive
rights to perform and demonstrate experiments on him. All these
observations resulted in a book published by Beaumont in 1833,
"Experiments and Observations on the Gastric Juice and the Physiology
of Digestion." Thus, despite the lack of any formal training in medicine,
Beaumont became a pioneer in gastric physiology and an outstanding
figure in the history of medicine.
Beaumont is given the credit of the following original observations
in gastric physiology:
i. An accurate and complete description of gastric juice.
ii. The presence of hydrochloric acid in the gastric juice.
iii. The establishment of the profound influence of mental
disturbances on the secretion of gastric juice and on digestion.
iv. Comparative study of the digestion in the stomach with digestion
outside the body.
v. The study of digestibility of different articles of food.
In the last decade of his life, Beaumont got involved in a bitter legal
battle over a man whom he had tried to save. A patient had been struck
with an iron cane. Beaumont tried to relieve the increased intracranial
pressure by removing a piece of skull bone by a trephine. In spite of his
best efforts, the patient died. It was argued in the court that the patient
died because Beaumont drilled a hole in the skull to perform experiments
on brain, just as he had left a hole in Martin’s stomach in order to do
experiments on the digestion.
Tom (Fig. 4.5) is a patient who is better known in the history of
gastrointestinal medicine than the physicians who wrote research articles
on his condition. In 1895, when Tom was nine years old, he gulped
what he thought was beer. What he actually drank was a mouthful of
boiling-hot chowder. He did not spit it out for fear of punishment by
his mother. He was treated in a hospital, but the esophageal opening
became permanently closed due to extensive scarring. So a permanent
opening was made in his abdominal wall and stomach, through which
he was fed for the rest of his life. The interior of the stomach was visible
on the abdominal surface.

Fig. 4.5: Tom

S. Wolf and H.G. Wolff reported observations on the effects of various
psychological factors on gastric secretion and gastric blood flow. Tom's
gastric fistula, and the observations made through it, became famous
in the medical world. Based on the observations on this single patient,
peptic ulcer was attributed to an aggressive personality and/or
psychogenic stress, or the ingestion of hot and spicy foods. This theory
held ground for half a century, till the role of Helicobacter pylori was
discovered.

FRANÇOIS MAGENDIE
François Magendie (1783-1855) (Fig.
4.6) was a French physiologist who is
considered a pioneer of experimental
physiology. He also made significant
contributions to pharmacology and
pathology. He is remembered for the
discovery of the function of the
anterior and the posterior roots of the
spinal cord (Bell-Magendie law),
discovery of Foramen of Magendie in
the fourth ventricle of the brain and the
introduction of drugs such as
strychnine, morphine and emetine into
medical practice.
Fig. 4.6: François Magendie

Magendie was the son of a famous surgeon who was influenced by the
teachings of Rousseau. As a result, Magendie reached the age of 10
without having attended any school or having learnt to read or write.
At that age, he entered a school and
showed rapid progress. At the age of fourteen, he won the grand prize in
a national essay-writing contest. In 1808, Magendie qualified in medicine
and was appointed anatomy demonstrator in Faculte de Medicine, Paris.
Due to his rude behavior, he antagonized his boss as well as other
colleagues, and was hence made to resign from the post within a few
years. Consequently, Magendie started private practice and organized
a private course in physiology. Only in 1826, at the age of 43, did
Magendie get a job in a teaching hospital. In 1830, he became director of
the women's ward in a famous hospital of Paris.
Magendie made many important discoveries in physiology and
other fields of medicine. His first physiological experiments were on
swallowing and vomiting, proving the passive role of the stomach in
vomiting. He proved the role of the liver in the detoxification process.
In collaboration with other workers, he introduced recently discovered
drugs such as strychnine, morphine, quinine, and veratridine into
medical practice. These were pioneering efforts in experimental
pharmacology. He was also the first
to observe anaphylaxis by injecting two doses of egg albumin in the
rabbit at short intervals. He even started publication of the first journal
exclusively devoted to physiology, Journal de Physiologie Experimentale.
His most important contribution to science was also his most
disputed discovery. Magendie, in France, and Charles Bell, in England,
were contemporaries. Both conducted a number of experiments on the
nervous system of animals. In 1811, Bell showed that the anterior
nerve roots of the spinal cord contained motor fibers. About eleven
years later, Magendie performed more detailed experimental studies
and showed that the anterior roots of the spinal cord did indeed carry
motor fibers and that the posterior roots carried exclusively sensory
fibers. When these results were published in the Journal de Physiologie
Experimentale in 1822, the traditional British-French rivalry flared up,
leading to a
prolonged controversy. Ultimately the problem was resolved by giving
the credit to both and naming the finding the Bell-Magendie law.
Magendie was also a notorious vivisector. He shocked many of his
contemporaries by brutal live dissections that he performed at public
lectures in physiology. Magendie did not mind performing the most
painful experiments with no clear idea of what he expected to see. In one
such lecture in England, Magendie tied an unanesthetized dog to the
table, dissected the nerves of half of its face, and left it overnight for
further dissection the next day. This demonstration produced such an
uproar in England that it led to the passage of a law against cruelty to
animals and a ban on experimentation on animals. (In continental Europe,
however, animal experimentation continued.)

CHARLES BELL
Sir Charles Bell (1774-1842) (Fig. 4.7)
is considered the foremost British
anatomist, physiologist, and surgeon of
his day. He is remembered for the detailed
description of the seventh cranial nerve
and its disorder named after him (Bell’s
palsy), and the Bell-Magendie law.
When he joined the medical school in Edinburgh, his elder brother,
John Bell, was a teacher of anatomy. Both brothers taught anatomy to
the class and published a book, "A System of Dissection Explaining the
Anatomy of the Human Body," even before Charles Bell's graduation.

Fig. 4.7: Charles Bell

The success of John Bell's anatomy classes aroused the jealousy of the
members of the faculty of medicine at the University of Edinburgh,
who succeeded in barring the two brothers from any position in the
university.
In 1804, his career blocked in Edinburgh, Charles Bell went to London
to try his luck. It was a bold step: in those days, owing to political
reasons, a Scottish degree was looked down upon in London. A Scottish
doctor was regarded as an interloper, out to share practice with the
British doctors, "the true sons of the soil." Bell bravely held to his
course and soon won for himself great esteem among the medical
fraternity of London. Being a successful artist, he independently
published a book, "Anatomy of Expression in Painting," in 1806. This
book became extremely popular among the British painters of that era,
who used it as their textbook. One of his paintings depicted the condition
of a patient with tetanus so vividly that it impressed not only the
painters but also his fellow physicians (Fig. 4.8).
Bell’s most important work was in the fields of research on the
brain and the nerves.

Fig. 4.8: Soldier with tetanus

His book "New Anatomy of the Brain," published in 1811, has been
called the "Magna Charta of neurology." Initially, it was printed on a
private press, and all the copies were sent to his
acquaintances. In the book, he discussed the functions of the brain,
including the cerebellum and the spinal cord. He described the double
roots of the spinal cord and reported that irritation of the ventral roots
caused muscle cramps but disturbance in the dorsal roots produced no
visible effects. He concluded that the ventral roots connected the
peripheral nervous system with the cerebrum, whereas the dorsal roots
connected the periphery to the cerebellum, a scheme that was
subsequently proved wrong.
In 1822, Francois Magendie, a French physiologist, demonstrated
that the ventral roots of the spinal cord contain the motor fibers whereas
the dorsal roots are sensory. Bell had missed the latter fact because of
his dislike of vivisection. The traditional rivalry between the British and
the French led to great controversy as to whom the credit for the discovery
should be given. After a bitter controversy in the medical journals, the
problem was resolved by giving credit to both (the Bell-Magendie law).
This distinction of nervous traffic is considered the first important step
in the understanding of neurophysiology.

ROBERT GRAVES
Robert James Graves (1797-1853) was one of the famous Irish
physicians of his era. He is remembered for the first complete clinical
description of the thyroid disorder named after him (Graves’ disease).
After graduation in medicine from the University of Dublin, with a
brilliant undergraduate career, he studied medicine in London, Berlin,
Gottingen, Hamburg and Copenhagen. Thus, he had the advantage of
knowing the latest methods in medical education in Europe of those
days. Back in his own country, he started the innovative method of
bedside training of medical students. It was a novelty, because so far,
medical education was imparted through classroom lectures only.
According to him, “a medical graduate should not be a person who had
never practiced.” Another innovation was that he taught in English
rather than in Latin as was the custom in those days. Other innovations
were the use of a pulse watch and the practice of giving food and liquids
to the patients with fever instead of withholding nourishment. This
practice was so revolutionary at that time that Graves told his colleagues,
half in jest, that on his grave, the epitaph should read: “He fed fevers.”
Graves was the first physician to describe the clinical picture of
exophthalmic goiter, now called the Graves’ disease. In the description
of the disease, besides the bulging eye-balls, Graves gave prominence to
the changes in pulse and the heart sounds of the patient. According to
him, in one patient, the heart sounds were so loud that they could be
heard by the physician four feet away from the patient. However, it is
to his discredit that Graves could not identify the primary defect in
exophthalmic goiter. According to him, the thyroid enlargement was a
result of abnormal function of the heart! Besides hyperthyroidism,
Graves described angioneurotic edema and the pinpoint pupil of
pontine hemorrhage.
Graves was an expert in many European languages. Once he was
imprisoned in Austria for ten days on the suspicion of being a German
spy. His only fault was that he spoke German so well, “which was not
possible for an Englishman!”

RUDOLF VIRCHOW
Rudolf Ludwig Karl Virchow (1821-1902) (Fig. 4.9) is considered the
most important German physician of the 19th century. Virchow
pioneered the modern concept of pathological processes by application
of the cell theory to explain disease in tissues and organs of the body. As
a result of his hard work and determination, great strides were made in
the fields of pathological and physio-
logical medicine.
Virchow was as much a scientist
as a social activist. As a young
pathologist, in 1848, Virchow was
sent to Upper Silesia, a province under
German control, to investigate the
cause of the outbreak of an epidemic
of typhus fever.

Fig. 4.9: Rudolf Ludwig Karl Virchow

In his report, Virchow blamed the German Government for the epidemic.
The root cause of the epidemic, he wrote, was extreme poverty, lack of
education, and the most unhygienic conditions in which the population
lived. His only plan of treatment and prevention was “full and unlimited
democracy for the region.” According to Virchow, his policy was for
prophylaxis, in contrast to palliative measures suggested by others. The
report enraged the German government to such an extent that Virchow
was fired from his job. Virchow showed no regrets and shifted his
research activities to the pathology department of the University of
Wurzburg in 1849. Within a few years, he became such an important figure in
German medicine that in 1856, he was brought back to Berlin as the head
of the newly created Institute of Pathology. Throughout his life, Virchow
continued to work for the betterment of social life and health care of the
German population. At a later stage of his career, Virchow was elected
to the Berlin City Council for exclusive work in the areas of public
health such as sewage disposal, hospital architecture, improvement of
meat inspection techniques, etc.

Virchow's most important contribution to medicine was the cell
theory of disease. For ages, it had been believed that an imbalance of
the four humors of the body (blood, phlegm, yellow bile, and black bile),
caused all types of diseases. The ghost of body humors was finally laid
to rest when Virchow showed that the cause of disease was a disturbance
of the function of tissue cells.
Virchow graduated in medicine in 1843. Within two years, Virchow
published a treatise on thrombosis giving the famous triad (Virchow’s
triad) of its pathogenesis. He sent many other papers for publication
but they were rejected, mainly because they contained ideas and concepts
far ahead of his time. In response, Virchow founded a new journal,
“Archives of Pathological Anatomy, Physiology and Clinical Medicine.”
The journal became the most prominent medical periodical of the time.
By the age of 25, Virchow had already discovered fibrinogen,
leukocytosis, and leukemia; had worked out the conditions which
predispose to thrombosis; and had explained and named pulmonary
embolism. He went on to describe the true nature of pus, necrotic cells,
sarcomas, red infarcts, amyloid, metastatic calcification, erythro-
phagocytosis, uterine fibromas, brown induration of the lungs, uric acid
nephropathy, and trichinosis. He discovered and named neuroglia,
gliomas, and giant cells. He coined the words “hyperplasia” and
“ischemia” and named many of the common tumors. His book “Cell
Pathology” published in 1859 became the foundation of all microscopic
studies of disease. It was through his efforts that anatomic pathology
became an essential component of curriculum in medical schools. He
wrote over 2000 research papers and books.

JULIUS COHNHEIM
Julius Cohnheim (1839-1884) (Fig. 4.10) was a German pathologist,
who is considered a founder of experimental pathology. He made
significant contributions to cardiovascular physiology and pathology.

After graduation in medicine, Cohnheim became chief assistant to
Rudolf Virchow. Till that time
pathology was studied as morbid
anatomy—study of pathological
changes in organs obtained post-
mortem. Cohnheim started the study
of pathological changes as they
developed during the course of a
disease, an endeavor now known as
“Pathophysiology.” For example, in
1860s, he investigated the pheno-
menon of inflammation as it
developed rather than after the
tissue was dead. After a series of elegant experiments, he concluded
that acute inflammation involved the migration of leukocytes from the
blood capillaries to the site of injury, resulting in what was earlier
known as "pus." This work was a challenge to Virchow's interpretation
of inflammation. Later studies proved Cohnheim correct.

Fig. 4.10: Julius Cohnheim
Cohnheim made several contributions to cardiovascular physiology.
He characterized the cardiac systole as a “force pump,” and cardiac
diastole as a “suction pump.” Thus, he considered even the cardiac
diastole an “active process.” More than 100 years later, research workers
in cardiology began to study the abnormal diastolic function in many
clinical settings, especially heart failure.
Cohnheim was especially interested in the pathophysiology of
valvular heart disease. He performed a series of experiments to elucidate
the mechanism of left ventricular hypertrophy or dysfunction in cases
with aortic stenosis or insufficiency. He also made numerous studies on
thrombosis, embolism, and infarction. All such studies resulted in an
exhaustive two-volume book, "Lectures on General Pathology," which
was the second most influential 19th-century text on the subject (after
Rudolf Virchow's classic book on cellular pathology).
During the last decade of his life, Cohnheim suffered from gouty
arthritis that confined him to bed. He died from complications of the
disease in 1884, at the age of 45. A pathologist to the end, he allowed
postmortem dissection of his body to reveal the pathological changes in
gouty arthritis.

ALOIS ALZHEIMER
Alois Alzheimer (1864-1915) (Fig.
4.11) was a German psychiatrist and
a pioneer in neuropathology.
After his graduation in medicine,
Alzheimer started his career as a
junior physician in a mental
asylum. Franz Nissl was another
psychiatrist working in the same
hospital. Alzheimer and Nissl
embarked on extensive investigations
on the pathology of the nervous
system, particularly of the cerebral
cortex. Their work resulted in the major six-volume publication,
Histological and Histopathological Studies on the Cerebral Cortex,
issued from 1906 to 1918.

Fig. 4.11: Alois Alzheimer

Alzheimer concentrated his efforts on the clinical
aspects of nervous diseases whereas Nissl was an expert in
histopathological work and staining techniques. At first, the duo worked
on the pathological changes of the brain in neurosyphilis. In those days,
about 10 percent of the patients in a mental asylum suffered from this
disease. In one of their papers, Alzheimer and Nissl discussed the
clinicopathological correlates in a series of 170 cases of neurosyphilis.
Much of what we know about this aspect of syphilis today has been
written by Alzheimer and Nissl. They also described for the first time
changes in the brain in cases of Huntington’s chorea, and arteriosclerosis
of the brain. Alzheimer also demonstrated that the cerebral cortical
neuronal loss seen in patients of epilepsy was an effect of repeated
attacks of epilepsy and not the cause of the disease.
Alzheimer discovered the disease named after him. Alzheimer's
disease is one of the most devastating diseases that may afflict humans.
His first case was a woman of 50 years of age who showed progressive
features of senile dementia and died 5 years later. In histopathological
studies of the autopsied brain, conducted in collaboration with Nissl,
he demonstrated certain features in the case not seen in ordinary senile
dementia. The characteristic histological features (plaques and
neurofibrillary tangles in the cerebral cortex) were described by him in
1906. Even 100 years later, little of significance has been added to the
histopathological description of Alzheimer's disease.
Alzheimer attracted students from various countries to learn the
neuropathological work. He always carried a cigar, which he would put
down near the microscope of a student. By the end of the day, a cigar
stump could be seen on the work bench of each of the 20 students he
was guiding.
Like many great scientists, Alzheimer had problems with the academic
establishment. While others were talking of psychoanalysis, Alzheimer
was trying to establish an association between mental illnesses and
pathology in the brain. His detractors called him a “mere anatomist.” It
was an inordinately long time before he was awarded a full professorship
in Psychiatry. Today Alzheimer is rightly recognized as a founding
father of neuropathology. The authenticity of his work was never in
doubt. In those times of prolific publishing (as even nowadays), when
everyone believes he has something of importance to say and many
advertise the little they find over and over again, Alzheimer never
presented anything unless it was truly important.

FRANZ NISSL
Franz Nissl (1860-1919) (Fig. 4.12)
was a German pioneer in neuro-
pathology. He contributed greatly
to Alzheimer's achievements as a
neuroscientist; it is difficult to
imagine what either man could
have achieved alone. Nissl
demonstrated his innovative histo-
logical skill even as a medical student.
Participating in a competition, Nissl
employed alcohol as a fixative and
developed a staining technique,
which led to the demonstration of a
number of previously unknown
constituents of the nerve cells. The
judge, a renowned neurologist, was so impressed that he invited Nissl
to his own research laboratory.

Fig. 4.12: Franz Nissl
Of the many innovations in the staining techniques, the best remembered
is the Nissl stain which could, for the first time, demonstrate the Nissl
granules in the neuron cell bodies. Nissl added a great deal to the
understanding of nervous diseases by relating them to the observable
changes in glial cells and other components of the nervous tissue. Many
historians consider Nissl to be the greatest neuropathologist of his time.
Nissl was also responsible for the popularization of the use of
lumbar puncture, introduced by another German physician of that
time, Heinrich Quincke. He performed lumbar punctures so often
that he was nicknamed "punctuator maximus."
Nissl remained a bachelor all his life, spending most of the time in
the research laboratory. Once, a young neurologist wanted to negotiate
for a place in Nissl’s laboratory. Nissl was busy in the laboratory and
asked him to meet at his house at 12 o'clock. At noon, the young doctor
reached his home, but there was no sign of Nissl. After a long wait, his
housekeeper suggested that perhaps Nissl meant midnight. Indeed, he
was seen by Nissl at midnight, and the conversation lasted until daybreak.

GUILLAUME DUCHENNE
Guillaume-Benjamin-Amand
Duchenne (1806-1875) (Fig. 4.13) was
a French neurologist who is remembered
for the discovery of a number of
myopathies named after him.
Duchenne descended from a family
of fishermen, traders, and sea captains
in Northern France. Despite his father's
efforts to induce him to follow the family
seafaring traditions, Duchenne preferred
to study medicine in Paris, under
Laennec. Duchenne had an average
academic career and failed to get an academic post after graduation. He
went into general practice in his home town and flourished for ten years.

Fig. 4.13: Guillaume-Benjamin-Amand Duchenne

Then his wife died during childbirth, which he himself had supervised.
His mother-in-law accused him of professional
incompetence. As a result of bad publicity, all his patients deserted him,
and he returned to Paris, penniless.
Duchenne found the medical fraternity of Paris rather hostile. He
was ridiculed for his provincial accent and his coarse manners. He started
working in many charity clinics and hospitals and soon built up a
moderately successful private practice. However, his chief interest was
not money. He lived for his patients and his scholarship.
Duchenne pursued his neurological studies in a very unorthodox
fashion. He used to haunt one or two major hospitals of Paris every day
in order to study the most interesting cases and make them the object of
his electrotherapeutic studies. He was mocked by the interns and rebuffed
by the senior medical staff, whom he derisively called “Monarchs of the
ward." Duchenne was a diligent investigator, meticulous at recording
clinical histories. He did not mind following his chronic patients from
hospital to hospital to complete his studies. In this way, he collected an
exceptionally rich clinical material, far superior to that available to any
other clinician of his time. That is why, in the later part of his life, he won
the recognition of other eminent neurologists of Paris. Jean-Martin Charcot
held him in great esteem and dubbed him "The Master."
More than any other person in his day, Duchenne was responsible for developing the technique of meticulous neurological examination.
He also took an early interest in electrophysiology. Electric stimulation
of the muscles had been tried earlier but it was found to produce extensive
tissue damage. He devised surface electrodes that were very safe. He
used external electric stimulation of the muscles, initially as a therapeutic
measure. Later he found its better use in electrodiagnosis of neuromuscular
disorders. He employed this technique to analyze the mechanism of
facial expressions, which he published, illustrated with many striking photographs (Fig. 4.14).
19th Century Medicine | 107

Duchenne was the first person to use the biopsy procedure to obtain tissues from a living patient for microscopic examination. This was greatly denounced in the
lay press, but this was the only way he
could confirm the diagnosis of many types
of muscular dystrophies during the lives of the patients. He described, for the first time,
a number of familial muscular dystrophies
named after him.
Duchenne was the first to distinguish between the upper and the lower motor neuron types of facial nerve paralysis. He was the first to differentiate the cerebellar and sensory types of ataxia, and he described syringomyelia. On the basis of his electrophysiological studies, he was the first to suggest, without any histopathological confirmation, that the profound paralysis in poliomyelitis must be due to a lesion located in the anterior horn cells of the spinal cord. This fact was subsequently confirmed histopathologically by Charcot.

Fig. 4.14: Duchenne and a patient showing an electrically induced smile

CARL LUDWIG
Carl Friedrich Wilhelm Ludwig (1816-1895) (Fig. 4.15) was a German
physician and physiologist. His major contribution to physiology lay not only in the large number of physiological phenomena he discovered, but also in the apparatus he developed for physiological research. He is
particularly remembered for the invention, in 1857, of the drum
kymograph (Fig. 4.16), the apparatus responsible for the initiation of experimental physiology and pharmacology, especially in the neuromuscular,
intestinal and cardiovascular systems. For the next 100 years, the drum
kymograph was as ubiquitous in physiology departments as the doctor’s
white coat in the hospitals. Unlike anatomic structures, physiological
phenomena are dynamic processes. They could not be recorded till
Ludwig came up with the mechanically revolving drum covered with a
smoked paper. Any movement of a mechanical lever placed against the
drum left a permanent impression on the smoked paper.
Ludwig was the first to keep animal organs alive outside the body.
For example, he could keep a frog’s heart beating for hours outside the
body by perfusing it with a solution similar in composition to frog’s
plasma. Ludwig discovered the vasomotor center in the medulla
oblongata, and was the first to measure the blood pressure in the
capillaries. He discovered the depressor and accelerator nerves of the
heart and formulated the “all or none law” of the heart.
Ludwig demonstrated the existence of a new class of nerves which regulated the secretory function of exocrine glands. He showed that
electric stimulation of the nerves to the salivary glands led to profuse
salivary secretion even after the animal had been decapitated.
Modern theories of the formation of urine and lymph stem from Ludwig’s paper of 1884. In this paper, Ludwig postulated that the surface
layer of the renal epithelium of the initial part of renal tubules (now
known as Bowman’s capsule) served as a passive filter in urine
production, the rate of which is controlled by blood pressure. He also
introduced the measurement of urinary nitrogen as an index of protein
metabolism in the body. There is indeed scarcely any branch of
physiology, except the physiology of special senses, to which he did
not make important contributions. Ludwig is considered one of the great
physiology teachers; nearly 200 of his students became prominent
scientists.

JEAN-MARTIN CHARCOT

Jean-Martin Charcot (1825-1893) (Fig. 4.17) was a French neurologist and a professor
of anatomical pathology in the University of
Paris. He made significant contributions to the
developing fields of neurology and psychology.
In 1882, he established a neurology clinic in
the famous Salpetriere Hospital in Paris, which
was the first of its kind in Europe. He was the
first to describe a disorder which came to be
known as Charcot’s joints (or Charcot’s
arthropathy), a degenerative disease of joint surfaces resulting from the loss of
proprioceptive sensations due to neurosyphilis. He researched the functions of different parts of the brain and the role of arteries in cerebral
hemorrhage. He is also remembered for the first description of the disorder
now named Charcot-Marie-Tooth disease as well as amyotrophic
lateral sclerosis.
In 1862, Charcot was appointed senior physician at the 5000-bed
Salpetriere Hospital. Charcot’s life revolved around the diagnosis and
classification of patients, correlating their clinical and pathological
findings. In 1882, he was appointed the first professor of neurology in
the University of Paris. The post and the department were created
especially for him. During this period, Charcot published a series of research papers correlating the clinical pictures of patients with neurological diseases with the pathological findings of their autopsies. These reports attracted worldwide attention among neurologists. Charcot’s triad is the name given to the three signs (intention tremor, nystagmus and scanning speech) seen in patients with disseminated (multiple) sclerosis.
Charcot was the first to describe a disease which he called
disseminated sclerosis. The patient was a poor woman. In order to
follow the progress of her disease, Charcot employed her as a
maid in his house. Finally when she died, the diagnosis was confirmed
by histopathological examination of the spinal cord.
Charcot also localized the motor centers of the cerebral cortex, and
was the first to describe a patient with amyotrophic lateral sclerosis. He
also described the miliary aneurysms in cerebral vessels and the ankle
clonus.
It was perhaps his incomparable qualities as a teacher, writer and
organizer that contributed most to his great reputation as a gifted clinician.
As a teacher, he replaced the traditional hospital rounds with clinical
demonstrations on the flood-lit stage of the hospital amphitheater. Besides
other neurological demonstrations, Charcot was famous for the “hysteria
shows” (Fig. 4.18).
In 1872, Charcot initiated his work on hysteria and hysterical hemianesthesia.
Charcot thought that he had discovered a
new disease which he called “hystero-
epilepsy”, a disorder of mind and brain,
combining features of hysteria and epilepsy.
The patients displayed a variety of symp-
toms, including convulsions, contortions,
and fainting. These demonstrations, held by
the great showman in the amphitheater of
the Salpetriere Hospital, were open to the lay public. The dramatic “hysteria shows” aroused so much public attention that
hysteria became the most common neurological disorder in Paris. Charcot
and neurology became household names in France.
One of his pupils was Joseph Babinski, who himself later became a
famous neurologist. Babinski did not agree with the diagnosis of
hysteroepilepsy. The patients were usually young and emotionally
troubled women. These patients suffered from social maladjustment
and came to the hospital with vague complaints of disease; a feeling of
demoralization or distress without any evident cause. Charcot would
diagnose them as patients of hysteroepilepsy and admit them to a ward
where other such patients as well as patients of epilepsy were housed.
Charcot’s interest in their personal problems, the encouragement of the
attendants and the example of others in the ward prompted the patients
to accept Charcot’s view of them and eventually to display the expected
symptoms. They began to imitate the epileptic attacks they repeatedly
witnessed. Babinski argued that the disease hysteroepilepsy was not
discovered but invented by Charcot. The patients suffered not from a
disease but from an idea. To prove his point, Babinski asked Charcot to
keep these patients isolated from other patients, and also from the staff
involved earlier in their treatment. Within weeks, the patients showed complete recovery from all the symptoms they had displayed earlier.
Babinski also reminded Charcot about the epidemics of fainting,
convulsions and wild screaming in convents and in girls’ boarding schools,
which ended only when the afflicted group was broken and scattered.
Finally, Charcot abandoned the idea of hysteroepilepsy, and the “hysteria shows” were held no more. In spite of the fiasco of hysteroepilepsy,
Charcot stands tall among the neurologists in the history of medicine.
Charcot was very hard-working. He worked uninterruptedly all
day, and the lamp on his desk could be seen burning at 2 o’clock at
night. His only leisurely pleasure was music. On Thursday evenings,
which were devoted entirely to music, nobody was allowed to utter a
single word on medicine. He loved animals and detested animal
experimentation. He had inscribed on his door, “You find no dog clinic
with me.”

Quotations of Jean-Martin Charcot


• Disease is very old, and nothing about it has changed. It is we who
change, as we learn to recognize what formerly was imperceptible.
• If the clinician, as an observer wishes to see things as they really are,
he must proceed without any preconceived notions whatever.
• Clinical medicine is made up of anomalies.
• In the last analysis, we see only what we are ready to see, what we
have been taught to see. We tend to eliminate and ignore everything
that is not a part of our prejudices.
• Common-place scepticism, which is so readily opposed to all
progress of human mind, is a convenient pillow for lazy heads.
• How is it that, one fine morning, Duchenne discovered a disease which
probably existed in the time of Hippocrates?
• To learn how to treat a disease, one must learn how to recognize it.
The diagnosis is the best trump in the scheme of treatment.
• Symptoms are in reality nothing but the cry from the suffering
organs.

JOSEPH BABINSKI
Joseph Jules Francois Felix Babinski
(1857-1932) (Fig. 4.19) is one of the most
famous French neurologists. He is known
to every medical student because of the
clinical sign of pyramidal tract lesion first
described and named after him (Babinski’s
sign). In addition, he was responsible for
many other neurological discoveries,
especially the dermatomes.
As a young medical student, Babinski
came under the guidance of the famous
French neurologist, Jean-Martin Charcot, at the Salpetriere Hospital in Paris. Charcot
soon recognized his sharp clinical skill, and Babinski became his favorite student. At that time, Charcot was most famous for the diagnosis
of hysteroepilepsy; a new disease “discovered” by him. Charcot gave
public demonstrations of such cases, nicknamed “hysteria shows” in
the newspapers. As a result, a large number of patients, chiefly young
women, were admitted to the wards of Salpetriere Hospital and treated
by Charcot. Babinski was the first to recognize the fallacy of the diagnosis.
He told Charcot that the disease “hysteroepilepsy” was not discovered
but invented by him. Finally, he was able to persuade Charcot to
acknowledge that the patients, usually women, were emotionally
disturbed and responded to the idea of the disease planted in their minds by
Charcot. There were no more cases of hysteroepilepsy, and the “hysteria shows” came to an end.
After Charcot’s death, Babinski was the most eligible candidate to
succeed him as professor of neurology. However, he was out-maneuvered
by Charles-Joseph Bouchard, one of the numerous pupils of Charcot.
As a result, Babinski left the Salpetriere Hospital and became head of a
neighboring hospital in Paris, Hospital de la Pitie.
Babinski’s failure to climb the academic ladder was to become a
matter of fundamental importance to French neurology. Since he was
not involved in teaching duties at the Pitie hospital, Babinski found
enough time to study in detail various neurological disorders. He was a
master clinician, who had greater faith in his clinical skill than in the neuropathological examinations or laboratory tests relied upon by most of his contemporaries. In his clinical work, Babinski was extremely noncommunicative. During his clinical examination, he did not utter a
word, sometimes not even afterwards. His working manner was
characterized by exceptionally strong observational powers and thorough
discussion of the clinical signs.
When Babinski published a research paper, it was often commendably short and concise. Such was the case when he presented, in 1896, a 26-line paper on the phenomenon, later called Babinski’s sign
(dorsal flexion of the big toe on scratching the outer border of the foot)
as an indicator of the pyramidal tract lesion. Babinski’s conclusion was
experimentally verified by Fulton, an experimental neurophysiologist, by the production of pyramidal tract lesions in chimpanzees. Within a few years,
elicitation of this sign became an integral component of the neurological
examination of a patient.
Another very important contribution of Babinski was the discovery
of dermatomes—areas of skin innervated by sensory fibers from different
spinal segments. At the beginning of the 20th century, many attempts were made to surgically remove tumors in the spinal canal, but most of the
time the laminectomy was performed at the wrong level (too low). In 1910, Babinski demonstrated that careful examination of the sensory system
(dermatomes) led to the accurate diagnosis of the level of the tumor, without
exception. With the help of his diagnostic method, surgery on the spinal
tumors became feasible.
In 1900, a year before Alfred Fröhlich, Babinski described the
adiposogenital syndrome in a case of pituitary tumor, a condition still
known as the Babinski-Fröhlich syndrome. In 1905, Babinski described
the neurophysiologic background of tabes dorsalis (a syphilitic lesion of the spinal cord). He also described in detail the clinical features of the cerebellar syndrome and introduced the terms asynergia and dysdiadochokinesia.
It is often noticed in the history of medicine that research workers suffer from a disease they themselves have researched extensively. No wonder Babinski, the great neurologist, suffered from Parkinson’s disease in the later years of his life.

SIGMUND FREUD
Sigmund Schlomo Freud (1856-1939)
(Fig. 4.20) was an Austrian neurologist and the founder of the psychoanalytic school of psychology.
Early in his medical career, Freud was
influenced by the “hysteria shows” of the
French neurologist, Jean-Martin Charcot. He
started the hypnotic treatment of hysteria
in his clinic in Austria. Soon, he realized the
futility of such a treatment and started what
he called “talk therapy.” He would ask the
patient to lie down on a couch and talk in detail about his problems, likes, dislikes, his
childhood, his relationship with the parents, siblings and the society in
general. He found that such prolonged talk sessions often acted as a
mental catharsis and the patient felt relieved of his mental stress. In the
21st century, in spite of the availability of numerous psychotropic
drugs, talk therapy is still considered an important adjunct in the
treatment of psychologically ill patients.
Probably, the most important contribution of Freud is the concept
of the unconscious mind. Before him, no one had suggested that our conscious mind is just the tip of the mental iceberg. Many of our actions
are dictated by unconscious desires, which we do not want to bring to
our consciousness. People often struggle to keep the real motive out of
their consciousness. The conflict between the unconscious desires and
conscious realities can be an important cause of social maladjustment
leading to a variety of mental disorders. The hold of Freud on the subject
of psychology is such that any discussion on the subject invariably
leads to what Freud has to say on the topic. Not that everyone agrees
with the views of Freud. Either you are his strong supporter or a bitter
critic, but his views cannot be ignored.
One of Freud’s views widely found distasteful is that all our actions throughout life are motivated by sexual desire. He tried to link human development and aging to changes in the objects of sexual desire.
According to him, right from childhood, a boy has sexual desire for his
mother (Oedipus complex) or a daughter for her father (Electra complex), which gradually wanes only because of the social taboos. Due to
such views, Freud is often called “a sex maniac, himself a psychic
patient who claimed to treat others.” Another forgettable work of Freud
was on dreams. He wrote books on the interpretation of dreams to
detect the patient’s unconscious mind. Most people consider it just humbug.
The cocaine episode is one of the darkest phases of Freud’s early
career. Freud was an early user and proponent of the psychotropic drug,
cocaine. He wrote several articles on the antidepressant properties of the drug. Later, Freud felt that cocaine would work as a cure-all for many
disorders. He wrote a well-received paper, “On Cocaine” (1884),
explaining its numerous virtues. Freud prescribed cocaine not only to
his patients but also to his close friends and family members, some of
whom became cocaine addicts. One of his patients was prescribed a 5% solution of cocaine as a treatment for intestinal pain. After taking cocaine,
the patient complained of numbness of his tongue and lips. This gave
Freud a clue about the possible anesthetic property of cocaine. A friend
of Freud, Carl Koller, who was present at that time, also heard the patient’s complaint of numbness. Freud requested another friend, Konigstein, an
eye doctor, to check the extent to which the anesthetic properties of cocaine could be used in patients with sore eyes. At that stage, Freud left town to be with his fiancée. On his return two months later, Freud was
shocked to know that Koller had studied the effects of cocaine on the
eyes of animals and had already presented the results at an ophthalmological congress. Freud was angry with Koller not merely because he
stole the idea and got credit for the discovery, but also because that was
the only action of cocaine later found to be of clinical use.
Freud smoked cigars for most of his life. Even after his jaw was
removed due to malignancy, he continued to smoke till his death. He
smoked an entire box of cigars daily. He underwent 30 operations for the
malignancy. Ultimately, the pain was so unbearable that he requested
his physician to end his life with an overdose of morphine.

Quotes by Freud
• What progress we are making! In the Middle Ages they would have
burned me. Now they are content with burning my books.
• The ego is not master in its own house.
• The first human who hurled an insult instead of a stone was the
founder of civilization.
• The goal of all life is death.
• The greatest question that has never been answered, and I have not
yet been able to answer, despite my thirty years of research into the
feminine soul is, “What does a woman want?”
• The tendency to aggression is an innate, independent, and instinctual
disposition in man—it constitutes the powerful obstacle to culture.
• Neurosis is the inability to tolerate ambiguity.
• One is very crazy when in love.
• Being entirely honest with oneself is a good exercise.
• America is a mistake, a giant mistake.
• America is the most grandiose experiment the world has seen. I am
afraid it is not going to be a success.

CLAUDE BERNARD
Claude Bernard (1813-1878) (Fig. 4.21)
was France’s most famous physiologist,
especially known for his ideas on the
internal environment of the body. He was
born to a poor wine-grower’s family. He
had a little education in Latin from a village
priest. At the age of 18, he became an
assistant of a pharmacist. Soon he was
disillusioned by the mixtures he prepared
and sold. A particular mixture in great
demand was composed of all sorts of materials, including even spoiled drugs from the shop. In disgust, he resigned the job and started writing plays for the
stage. A critic, who read his work, advised him to “leave play-writing to
more talented individuals and take up medicine, an easier profession well-suited to his poor mental caliber.” Claude Bernard took the advice
seriously and became a student of the famous French physiologist,
François Magendie. In 1843, he obtained his medical degree, ranking 26th in a class of 29. He joined Magendie as his research assistant.
However, he could not pass the examination that would have qualified
him for a teaching post. In disgust, he tried to run a private course in
experimental physiology, but was not successful. Due to financial
difficulties, Bernard was about to take up a position as a country
physician. However, one of his friends advised him to go for a marriage
of convenience. Bernard married the daughter of a rich doctor of Paris.
The dowry of 600,000 francs helped to set up an independent research
laboratory. In 1847, he was appointed deputy-professor in the
department of Magendie and started a series of experiments that led to
important discoveries mentioned below. In 1855, he succeeded Magendie as full professor at the Collège de France.
The first important discovery made by Claude Bernard concerned the role of the pancreas in the digestion of fat. The discovery was based on an observation at the dining table. While trying to clean a rabbit before
cooking, Bernard noticed that the lacteals were filled with a white fluid
(absorbed fat) only up to some distance from the pylorus, in contrast to
dogs in which the lacteals were full of absorbed fat right up to the
pylorus. He correlated this with the difference in the entry point of the pancreatic duct into the intestine in the rabbit and the dog. He felt that pancreatic juice was somehow important in the digestion of fats in the intestine.
This fact was confirmed when he observed the splitting of neutral fats incubated with crushed pancreatic tissue. Moreover, he demonstrated
that obstruction of the pancreatic duct in a dog resulted in the loss of a large amount of fat in the stools of the animal. This discovery was made in
1846, and earned him a prize (the Grand Prix, 1849) in experimental
physiology from the French Academy of Sciences.
The next important discovery was the role of the liver in glucose
metabolism. He discovered the presence of glycogen in the liver which
acts as a storehouse of glucose, from which glucose can be released
for use by rest of the body. He also showed that the plasma glucose level
rose, even in normal individuals, with the ingestion of food and did not
necessarily indicate the presence of diabetes. For this work, Bernard
received the Grand Prix for 1851. Another important work was the discovery
of the vasoconstrictor and vasodilator autonomic nerves.
In about 1851, Bernard was engaged in the study of the effects of
the resection of nerves on the temperature of the parts of the body
supplied by them. He observed that the resection of cervical sympathetic
nerves resulted in greater blood flow and pulsations in arteries of that
side of the head. Stimulation of the upper cut ends of the nerves produced just the opposite effect. As might be expected, a Grand Prix (1853) followed.
Due to working long hours in the damp laboratory situated in a
basement, his health began to deteriorate and by 1860, Bernard stopped
active research work. However, his zeal for research was not yet over. In 1865, he published a book, An Introduction to the Study of Experimental
Medicine. It was a masterpiece which continued to be read by generations
of research workers. Bernard was a strong believer in the necessity of
formulating a working hypothesis, derived from perusal of the literature
or observation of a natural phenomenon, before starting on the experiment
proper. This was in sharp contrast to the approach of his immediate superior, François Magendie, who believed in “let us do this
procedure and see what happens.” Claude Bernard used to say: “He
who does not know what he is looking for will not lay hand on what he
has found when he gets it.” Bernard was a strong advocate of animal
experimentation for the progress of physiology and medicine. This fact
is significant in view of the strong antivivisection agitations sweeping
England and France at that time. (It is interesting to note that the wife of
Claude Bernard was a very vocal supporter of the antivivisection lobby in France, and this was one of the reasons for their divorce).
Probably the most important contribution of Claude Bernard was
the concept of the milieu intérieur (internal environment). Conditions in the
world around us constantly change but the delicate balance of the internal
characteristics of our body tissues is not affected. This is achieved through a series of physiological mechanisms, collectively called homeostasis, acting on the fluids circulating in the body.
Besides the three Grand Prix awards given by the French Academy
of Sciences, Claude Bernard was honored in his life time with numerous
other titles. In the mid-1870s, Bernard developed disorders of the liver
and the pancreas (the organs on which he had researched extensively!)
and died in 1878.

Quotations by Bernard
• I consider the hospital the antechamber of medicine; it is the place
where the physician makes his observations. But the laboratory is
the temple of the science of medicine.
• A scientific hypothesis is merely a scientific idea, pre-conceived or
provisional. A theory is merely a scientific idea controlled by
experiment.
• It is immoral to make an experiment on man when it is dangerous to
him, even though the results may be useful to others.
• Experiment is fundamentally only an induced observation.
• Medicine includes real experiments which are spontaneous, and not induced by physicians.
• A great discovery is a fact whose appearance in science gives rise to
shining ideas, whose light dispels many obscurities and shows new
paths.
BROWN-SÉQUARD
Charles-Edouard Brown-Séquard
(1817-1894) (Fig. 4.22) was a French
physiologist, famous for his pioneering
work on the physiology and pathology
of the spinal cord. He was born in
Mauritius to an American father and a
Mauritian mother of French origin. At
birth he was named Charles-Edouard
Brown. Since his father died before his
birth, he was raised by his mother, named
Séquard, in great financial difficulties. In
honor of his mother, he added her maiden Fig. 4.22: Charles-Edouard
name to his surname and called himself Brown-Séquard
Brown-Séquard. More probably, he attached the French surname in
order to gain acceptance in French medical circles, but the change of name did not help.
Brown-Séquard graduated in medicine in Paris, in 1846, and started
his researches in the physiology of the spinal cord. In spite of his
extraordinary expertise in experimental physiology, he never got a good
academic appointment for the next 30 years. In this period, he worked
in France, USA, England and Mauritius, never more than a couple of
years at one place. The trouble was his origin. Because of the chauvinistic attitudes prevalent at that time, none of the countries where he worked really accepted him as its own research worker.
In the USA, in 1854, Brown-Séquard was appointed professor of physiology in the Medical College of Virginia. The appointment of a
foreigner to the medical college became a topic for the editorials in local
medical journals. Due to an oversight, he was not allotted a laboratory with
animals. Consequently, he and his students used to search the city for
suitable animals such as cats, dogs, raccoons, etc. In those days, American
doctors did not believe in experimental animal research. The local
population was horrified at the strange sounds of animals being
experimented upon without anesthesia in the medical college. Within
one year, Brown-Séquard was asked to leave the job.
Brown-Séquard did not mind experimentation even on himself. Once
he painted his entire body with waterproof varnish to “discover the
facts about human sweat.” Some hours later, he was barely rescued by
a medical student, “when he was beginning to die.” He also used to
swallow a sponge tied to a string so as to collect gastric juice.
Brown-Séquard’s major contribution was a book, “Experimental and Clinical Researches on the Physiology and Pathology of the Spinal Cord,”
published by him in 1855. In this book, he demonstrated the decussation
of fibers carrying pain and temperature sensations in the spinal cord
itself. His name was immortalized in the history of medicine by his
description of a syndrome caused by hemisection of the spinal cord (the Brown-Séquard syndrome). For his extensive experimental
researches, Brown-Séquard received five prizes from the French Academy
of Sciences, and he was awarded research grants twice by the British
Royal Society. After the death of Claude Bernard, in 1878, Brown-
Séquard was offered, for the first time, an important academic appoint-
ment in France. He became a professor of Experimental Medicine in
College de France, Paris. He remained there till his death in 1894.
In 1889, at the age of 72 years, Brown-Séquard reported in the
Lancet that he had rejuvenated himself by injecting testicular extract of
dogs and guinea pigs. According to him, “A radical change has taken
place in me. I have gained at least all the strength I possessed a good
many years ago—My limbs tested with a dynamometer showed a decided
gain of strength— I had a greater improvement with regard to the expulsion
of fecal matter than in any other function—With regard to the facility of
intellectual labor, which had diminished within the last few years, a
return to my previous condition became quite manifest.” The report was widely publicized and was largely responsible for the rise of
“organotherapy,” a mode of treatment that became widespread in Europe
and North America. Extracts from animal testes, adrenal, pituitary, as
well as nonendocrine tissues such as liver, spleen, and spinal cord were
injected to treat a variety of disorders and to counteract the aging process.
However, soon the medical profession became aware of the foolishness
of the claim, and organotherapy was abandoned for good. Thus, after a
long and fruitful research career, Brown-Séquard had an ignominious
end.

CHARLES DARWIN
Charles Darwin (1809-1882) (Fig. 4.23)
was a British naturalist who presented the
most controversial theory of his times,
the theory of evolution of mankind.
Darwin was not at all good at studies. Consequently, following his graduation from a divinity school, he failed to secure employment. However, he was able to secure a position as ship’s naturalist aboard
the HMS Beagle. The Beagle took a five-year cruise around the world, from 1831 to 1836. During the course of the voyage, Darwin noticed how species changed
along the coast of South America following the changes in the
environmental conditions. On return to England, Darwin studied in
detail all the specimens collected by him during the voyage.
In 1838, Darwin had his first sight of an ape. Its antics impressed
him as being “just like a naughty child.” He was reminded of the aboriginals
he had seen in South America. He felt that there was little gulf between
man and animals despite theological doctrines that only mankind
possessed soul.
On his return from the Beagle’s voyage, Darwin was contemplating
marriage. One of his notebooks, found after his death, shows that
he tried to decide the issue by listing the pros and cons of marriage on a
sheet of paper. Under the column headed “Marry” were written many
advantages including “constant companionship and a friend in old
age….better than a dog anyhow.” Under the column “Not Marry” were
listed many disadvantages which included “less money for books” and
“terrible loss of time.” In any case, he found the first column lengthier
than the second, and married one of his cousins.
In 1859, he came out with his book “On the Origin of Species by
Means of Natural Selection.” The main theme of the theory of natural
selection was that all animals produce more offspring than can be supported by
the available natural resources. Consequently, there is a struggle for
survival. In sexually reproducing species, no two individuals are identical,
that is, genetic variations occur naturally and are inheritable. Individuals
less suited to the environment are less likely to survive. Individuals
more suited to the environment are likely to survive and transmit the
superior trait to the next generation. Over generations, new species are
thus created.
Publication of the book created a severe religious backlash. Darwin
was wrongly accused of preaching that human beings are direct
descendants of apes. Most religions, especially Roman
Catholicism, believe in the Divine creation of man. According to them, the
evolutionary theory degrades human beings by placing them at the
same level as animals. What Darwin meant was that apes and man share
a common ancestor in the course of evolution over millions of years.
Thomas Huxley, a contemporary of Darwin, was one of the most
vocal supporters of Darwin’s theory of evolution. Since Darwin was in
poor health, Huxley was his “bulldog” in most of the public debates on
the subject.

Fig. 4.24: Man’s place in nature

In one such debate, an opponent of the evolutionary
theory asked Huxley: “Have you descended from monkeys on your
grandfather’s side or your grandmother’s side?” Huxley replied: “I would
rather be a descendant of an ape than of a cultivated man who used his
gifts of culture and eloquence in the service of prejudice and falsehood.”
Huxley, a physician by training, was a great intellectual of his times. He
published his own book in support of the theory of evolution: “Evidence
as to Man’s Place in Nature.” The illustration on the title page of the
book compares the skeletons of various types of apes to that of a
human (Fig. 4.24).

HEALTH CARE IN 19TH CENTURY LONDON

Sanitation
In the Victorian era, London was the largest and most spectacular
city in the world. Conditions of the city in the 1850s are well-documented.
Perusal of these documents highlights the extremely poor sanitary
conditions at that time:
“Personal cleanliness is not a big priority, nor is washing the clothes.
In close crowded rooms, the smell of unwashed bodies is stifling. It is
unbearably hot near a fire-place and numbingly cold away from it. The
homes of the rich exist in close proximity to the area of the unbelievable
squalor and filth, in which the poor lived. Streets are full of tons of dung
coming from the thousands of horse-drawn carriages. Cattles are also
driven through the streets. Hundreds of oxen and sheeps are slaughtered
every day; their blood is allowed to spill over to the streets. People can
be seen walking in ankle-deep mud mixed with animal blood. Sometimes
straw is scattered on the streets so soak up the wet.”

Physicians/Doctors
In that era, the physicians occupied the highest
rung of the social ladder (Fig. 4.25). Physicians
were considered gentlemen; unlike
surgeons, their training did not include
apprenticeship, and the profession excluded
supposedly manual labor. A physician was
asked to dine with the family during home
visits, while a surgeon dined with the servants.
A physician’s fee was wrapped and placed
nearby because, theoretically, gentlemen did
not accept money for their work. Their prestige
originated in their education, often a degree
from a prestigious university such as Oxford,
Cambridge, or Edinburgh. A medical degree,
however, did not require any clinical experience.

Fig. 4.25: Early 19th century physician
The medical colleges required at least four years of training to graduate.
These years had to be filled in somehow, and they were “apt to be spent
in idleness and sensual gratification; medical students had an unenviable
reputation for drunkenness and debauchery.” To graduate, students often
employed a “grinder,” someone who prepared the student by teaching
him the questions and answers for the examination by rote. Favoritism
and nepotism also helped a student to graduate. These practices were
common as late as 1870. Even after graduation, a physician seldom
examined the patient. Medicine was prescribed after merely listening to
the complaints. It must have been a miracle of God if any seriously sick patient
was cured by such treatment. When a 19th century doctor declared:
“I prescribe, He cures”, what he really meant was “I charge; He cures.”

Surgeons
The most important difference between a
physician and a surgeon in the 19th century
was that the surgeon usually had no formal
education (Fig. 4.26). He was apprenticed
in the mid-teens to a practicing surgeon.
The surgeon was paid by the parents of the
boy to watch him in his work, read his
books, and later act as his assistant. Such
pupils were notorious for their rowdiness.
Surgeons stitched wounds, set bones, pulled
teeth, etc. Bloodletting, a common form of
treatment for innumerable ailments, was also
assigned to the surgeons. Surgeons were not
allowed to be known as doctors. A surgeon
was addressed as plain mister, in contrast to
a physician, who was addressed as doctor
so-and-so.

Fig. 4.26: Early 19th century surgeon

By the end of the 19th century, the status of surgeons improved
to a great extent. By that time, anesthesia became available for painless
surgery, and antiseptic techniques allowed types of abdominal
operations unthinkable a few decades earlier.

Hospitals (Figs 4.27 and 4.28)


Till the 19th century, wounds were seldom known to heal in a clean,
straightforward way. Suppuration of the wound was almost universal. In medical
schools, it was taught that pus formation was one of the essential
processes of healing. Surgeons talked of “laudable pus.” The conditions
of the hospitals in England and Europe were so bad that they were
labeled as “sinks of human life”. The hospital wards were overcrowded:
often as many as six patients occupied one bed. Inadequate water supply
and filthiness of the ward was a rule, rather than an exception. The work
style of the surgeons was such that they themselves were the source of
infection in their patients. Erysipelas, pyemia, septicemia, and gangrene
were known as “hospital diseases,” since they usually developed after
the patient was hospitalized. These complications killed surgical cases
like flies.

Fig. 4.27: 19th century operation theater
Fig. 4.28: Modern operation room
Nineteenth century surgeons did not sterilize or even wash their
instruments. They kept on using the same probe or knife on patient after
patient. The surgeons did not wash the frock coats they used in the operation
theater for months. These were often crusted with blood and pus. For
some surgeons, the amount of blood and pus dried on the coat
was a mark of superiority as a surgeon. The operation theater was
often full of colleagues or even laymen clothed in the usual outdoor
dress. For the surgeon the surgical operation was no less than a theatrical
performance. Probably that is why the operation room was called an
operation theater. Under these circumstances, one might wonder how
anyone survived a surgical operation.

HISTORICAL IMAGE OF WOMEN AS PATIENTS


The historical view of women as patients is interesting. Ever since
ancient times, medical writers have implied that female gender itself is
an abnormality, i.e. inherently pathological. A
more charitable view was that women are a weaker
and “wetter” version of men, and hence more
prone to sickness.
In the medieval period, even up to the 17th
century, some old women were labeled as
witches, since they were believed to be
possessed by Satan. The witches were
supposed to have wicked and diabolic powers.
They were arrested, tried and usually put to
death. By the 18th century, it came to be realized
that witches were nothing but mentally sick
women. The problem lay with their prosecutors,
who were commonly ignorant and superstitious
peasants (Fig. 4.29).

Fig. 4.29: Treating a “witch”
In the Victorian era, any woman with “excessive” sexual appetite
was diagnosed as suffering from “nymphomania” (though no one could
define the ‘normal’ sexual appetite). Nymphomania was sometimes
treated by surgical removal of the ovaries, uterus or clitoris. Even in the 21st
century, female genital mutilation is highly prevalent in some
African countries. The procedure varies from surgical removal of the clitoris
to extensive resection of external genitalia and near closure of the vaginal
opening. The custom is said to “ensure that the young girl remains a
virgin or a good marriage material.”
A study of the complaints of 2000 women patients in the 18th century
revealed that the following were most common—dull headache, fading eye
sight, fading hearing, heavy tongue, a rising of blood towards the breast,
choking of the breast, heart anxiety, painful womb, colic in the womb,
wind turning upwards, wind moving downward or moving wind, etc.
Such complaints would be as baffling to the top physicians today as
they were 200 years ago. These symptoms were seen mostly in the women
of affluent families who had nothing to do the whole day. Love-sick woman
was a fairly common diagnosis (Fig. 4.30). Such neurotic patients
provided bread and butter to the majority of the physicians of those times.

Fig. 4.30: Love-sick woman


Nineteenth century physicians regarded tuberculosis as primarily a
disease of women. Simply being female was considered a condition
favorable to the development of tuberculosis. Romantic fashion
demanded that ladies be slim and delicate. It was insinuated that young
females or their mothers even courted tuberculosis to become slim enough
to snare a husband! According to another point of view, shared by
Laennec, the famous French physician, syphilis and tuberculosis, the
main diseases of the prostitutes of Paris of those days, were a result of
“venereal excesses.” (So much for the clinical acumen of the famous
physician).
Throughout the history of mankind, menstruation has been regarded
as shameful, unclean or unhealthy. A Roman historian has given the
following description of the menstrual blood: “Contact with it turns
new wine sour, crops touched by it become barren, grafts die, seeds in
gardens are dried up, the fruit of trees falls off, the edge of steel and the
gleam of ivory are dulled, hives of bees die, even bronze and iron are at
once seized by rust, and a horrible smell fills the air; to taste it drives
dogs mad.”
In the Middle Ages, it was thought that menstruation was the result
of the inferior way in which women were put together, or that women bled
because they were cursed by God. Till recently, strange ideas about
menstruation continued to be propagated. As late as the 1960s, some medical
guide books suggested that women should not bathe or exercise
during their periods.

MIASMA THEORY OF DISEASE


In ancient times, the cause of disease was said to be a punishment
from God. Hippocrates and his followers attributed disease to a
disturbance of the four body humors. The miasma theory of disease
became popular in the Middle Ages. It remained popular till near the
end of nineteenth century when Pasteur and Koch proved the germ
theory of disease.
Miasma is a Greek word meaning pollution or a noxious form of
“bad air”. It was supposed to be a sort of poisonous vapor or mist
originating from decomposing human or animal waste. It was believed to
be the cause of various diseases like cholera or plague. Miasma was
identified by the foul smell.
For example, the cholera
epidemic of London, in
1851, was attributed to
miasma originating from
the River Thames (Fig.
4.31). Actually most of the
sewers of the city opened
into the river, and the
drinking water was drawn
from it. Because of the
miasma theory of disease, some sanitary measures
were taken.

Fig. 4.31: Miasma theory of disease

The night-soil began to be removed from human proximity.
Sewers began to be constructed and improved to prevent the foul smell.
Garbage piles and sewage were tended to in order to make the city smell
better. Thus, in the Victorian era, people were inadvertently removing
bacteria, the true cause of disease. However in some cases, removing the
sewage only complicated the situation further. Often the sewage was
flushed directly into local water supply!
Moreover, the supposed relation between foul smell and disease led
to gross negligence in the hospitals. A physician would attend to patients,
one after another, including septic cases, without ever washing his
hands. Why? Because there was no foul smell; therefore, there was no
risk. Because infection was supposed to be carried by “bad air,” the
physician could never imagine that he was transferring disease through
his fingernails. During the great plague of 1465, doctors used to visit
patients wearing masks filled with sweet smelling flowers to ward off
the evil air. Notable supporters of the miasma theory included Dr William
Farr of London (1850s) and the nurse Florence Nightingale (1820-1910).
She was particularly instrumental in starting the movement to make
hospitals clean and airy.
In those days, even the
night air was supposed to
be “bad air” (Fig. 4.32).
People used to sleep with
doors and windows of their
bedrooms tightly shut to
keep out the “bad night air.”
No wonder, respiratory
diseases like bronchitis and
tuberculosis were highly
prevalent.

Fig. 4.32: Miasma theory of disease
Thus, although the miasma theory was ultimately proved wrong, it
was not the unmitigated disaster it may seem. It did help to propose a
connection between dirtiness and disease. It made the public aware that
diseases like cholera became epidemics in places where water was
undrained or foul-smelling. This theory led to improvements in sanitary
system in the towns and in hospital environment. The miasma theory
helped to attract scientists towards putrefaction (organic decay) which
ultimately led to the germ theory.

IGNAZ SEMMELWEIS
Ignaz Semmelweis (1818-1865) (Fig. 4.33) was the Hungarian-
Austrian physician who demonstrated that the puerperal fever (“childbed
fever”) was contagious and that its
incidence could be drastically reduced by
use of proper hygienic methods by the
medical staff attending on the patients.
Semmelweis graduated from a
medical school in Vienna. After some
further surgical training, he became a
senior house officer in the obstetrical
department of Vienna General Hospital
in 1846. The ward in which Semmelweis
worked had a maternal mortality rate of
13 percent due to puerperal fever.
Fig. 4.33: Ignaz Semmelweis

Another maternity ward in the same
hospital had a maternal mortality rate of
only 2 percent, even though both wards
used similar techniques. The only difference was that the first ward
was used for training of doctors whereas the second ward was used for
training of midwives. Doctors and medical students visited only the
first ward. Being a teaching hospital, doctors and medical students were
encouraged to attend postmortems and do dissection of cadavers as
often as possible. Consequently, the doctors and students would often
attend maternity cases just after a postmortem or a dissection.
Semmelweis thought that some “cadaveric material” was being transferred
by the medical staff which caused childbed fever. (The germ theory had
not originated by that time). He instituted a policy of using a solution of
chlorinated lime for washing hands between autopsy work and attending
on the maternity cases. Soon the mortality fell to 2.38 percent; almost
similar to the second maternity ward. There was total resistance to his
idea. No one was willing to wash hands repeatedly before seeing each
pregnant case, since it involved “too much work.” Moreover, doctors
were not willing to admit that they themselves caused the death of so
many patients every month.
The breakthrough came in 1847 with the death of his colleague and
friend Dr Jakob from an infection contracted after his finger was
accidentally injured during an autopsy. On postmortem, Semmelweis
found the same pathological picture as seen in women dying of puerperal
fever. This observation strengthened the view of “cadaveric
material” as the cause of puerperal fever. So Semmelweis widened
the scope of the washing protocol to include all instruments coming in
contact with patients in labor. In reaction, Semmelweis was dismissed
from his job in 1849. Consequently, he moved to Hungary and became
head of the maternity ward of a hospital in 1851. Here, his protocol of
washing hands and instruments brought the maternal mortality rate below 1
percent. His methods found wider acceptance throughout Hungary. In
1861 he published his experience with puerperal sepsis as a book, “Etiology,
Concept and Prophylaxis of Childbed Fever.” The book received most
unfavorable reviews. The world had to wait for many decades before
Semmelweis’s discovery could be appreciated. In spite of his pioneering
work on puerperal sepsis, Semmelweis received no award, either during his
life or posthumously. By the early 1860s, Semmelweis began to show signs of
dementia. He was admitted to a mental asylum, where he died in 1865
after a severe beating by the hospital staff.

LOUIS PASTEUR
Louis Pasteur (1822-1895) (Fig. 4.34) was a French chemist. He is
remembered for the discovery that most infectious diseases are caused
by microbes (germs) present in the air. This concept, known as the
“Germ Theory of Disease”, was a revolutionary theory in those days.
Pasteur thereby laid the foundation of the science of microbiology.
Initially, Pasteur, being a chemist,
was interested in crystallography and
made pioneering discoveries in stereo-
chemistry. In recognition of this work,
Pasteur was appointed Dean of the
Faculty of Sciences at the University
of Lille (France) in 1854. At that time,
the city of Lille was the center of alcohol
manufacture. One of the factories had a
recurring problem that some of the vats
of fermented beer turned sour and had
to be thrown away, causing huge losses.
They approached Pasteur for a
solution.

Fig. 4.34: Louis Pasteur

Using a microscope, Pasteur
found a huge number of microbes in the soured beer whereas “good
beer” did not show any. This led Pasteur to conclude that the bacteria
were responsible for the beer getting sour. Pasteur studied the effect of
prolonged exposure to air of other liquids like milk, wine and vinegar. Microbes
could be seen under a microscope in all these liquids, but did not develop
if the liquid had initially been heated to 55 degrees Celsius. This process
subsequently came to be known as “pasteurization” and is still widely
used to preserve milk or beer.
Louis Pasteur continued work on the germ theory for many years.
He exposed boiled broths (liquid growth medium) to air in vessels that
contained a filter to prevent particles from passing into the growth
medium, or in flasks ending in a long tortuous tube that would not allow
contaminants to pass. Nothing grew in the broths. Many such
experiments followed. Ultimately in 1865 he presented his data to the
Academy of Sciences of the University of Paris. He could prove that:
• Air contained living organisms.
• These organisms could produce putrefaction.
• These microbes could be killed by heating the liquid containing the
microbes.
Initially there was great opposition to his idea of germs causing
human diseases. It had been accepted as a fact that putrefaction came
from within the body. No one was willing to look into the microscope.
Only after a long and bitter battle could he convince some of the
scientists present. Subsequently, Pasteur discovered three bacteria
responsible for many human illnesses—streptococcus, staphylococcus
and pneumococcus.
Since Pasteur was not a medical scientist, he could not pursue the
medical problems alone. Therefore, he invited two young and brilliant
doctors, Emile Roux and Charles Chamberland, to join his research
projects. The first project the team took up was on chicken cholera. In
the summer of 1880, Pasteur instructed Chamberland to inoculate the
chicken with a culture of the chicken cholera bacteria and went on a
holiday. Chamberland forgot to do so and went on holiday himself.
On his return, Chamberland injected the month-old cultures into the
chickens. To his surprise, the chickens did not die of the disease as expected.
Even when the chickens were reinjected with fresh cultures, they
survived. Pasteur concluded that the injection of the month-old bacteria
had led to the development of immunity against the infection, just as Jenner
had shown for smallpox. This led him to develop a vaccine against
anthrax, which infected cattle. However, there was so much doubt
about the vaccine that he was made to give a public demonstration of the
efficacy of the anthrax vaccine in 1881.
Next, Pasteur and his team went on to produce a vaccine for rabies
by using dried spinal cords of infected animals. After just thirteen
experiments on dogs, Pasteur got an opportunity to test the vaccine
on humans in 1885. A young boy, Joseph Meister, bitten by a rabid dog,
was brought to Pasteur. Pasteur was not a qualified physician; therefore,
he was hesitant to administer the rabies vaccine to the boy. However,
since death was otherwise certain, Pasteur took the bold step of injecting the boy
with the vaccine. The boy survived, and therefore the legal matter was
not pursued by the police. A few months later, a second victim turned
up. He was a young shepherd also bitten by a mad dog. Following
reports of his successful treatment, the wild acclaim for Pasteur knew
no bounds. Victims of dog and wolf bites not only from all over France,
but also other European countries poured into his laboratory for
treatment. The newspapers and public followed these treatments and
cures with intense interest. Pasteur became a hero and a legend. Soon the
first Pasteur Institute was established in Paris to prepare vaccine against
rabies on a large scale. Pasteur went on to dedicate the remaining seven
years of his life to the institute. During this period Pasteur was honored
with many decorations in many parts of the world. Pasteur died in 1895
from a cerebral stroke. His remains were transferred to a permanent
crypt in the Pasteur Institute, Paris.
Joseph Meister, the first person to receive the rabies vaccine,
returned to the Pasteur Institute as an employee, where he served as
gatekeeper. In 1940, 55 years after his treatment for rabies that made
medical history, he was ordered by the German occupiers of Paris to
open Pasteur’s crypt. Rather than defile the remains of his savior, Joseph
Meister committed suicide!

Quotes of Pasteur
• In the field of observation, fortune favors only the prepared mind.
• Do not put forward anything that you cannot prove by
experimentation.
• Let me tell you the secret that has led me to my goal. My strength
lies solely in my tenacity.
• I beseech you to take interest in these sacred domains so expressively
called laboratories. Ask that there be more and that they be adorned,
for these are the temples of the future health and well-being.
SIR JOSEPH LISTER


Sir Joseph Lister (1827-1912) (Fig.
4.35), a British surgeon, is one of the
most important names in the history of
surgery. He is considered the Father of
Antiseptic Surgery.
Lister studied medicine in the
University College, London. In 1860,
he became Professor of Surgery at the
University of Glasgow. In the Royal
Infirmary, under his charge, he was
horrified to see the conditions of the surgical
patients.

Fig. 4.35: Sir Joseph Lister

In the industrial city of
Glasgow, the accident rate was high and
amputation was a common surgical procedure. Lister was saddened by
the high mortality rate among amputees. In those times, amputees
had less than a 60 percent survival rate. The patients tolerated the trauma
of surgery but later died of postoperative infection, called the “ward
fever”. The usual explanation for wound infection was that the exposed
tissues were damaged by miasma in the air. Actually, the surgical wards
usually smelled bad, because of rotting of the wounds, which was taken
as an evidence of miasma. The hospital wards were only occasionally
aired at midday, otherwise all the doors and windows were kept closed.
Facilities for washing of the surgeon’s hands or the patient’s wounds
were nonexistent and even considered unnecessary.
In 1865, Lister came across the book written by Pasteur, in which
the author had convincingly demolished the prevalent view of
spontaneous generation of pathogens in the wounds or miasma as
possible causes of putrefaction in the wounds. Lister was also familiar
with the book on puerperal sepsis written by Semmelweis. Lister was
convinced that wounds were infected by pathogens from the air. He
devised various methods to prevent infections of the wounds. Pasteur
had suggested three methods to kill the bacteria—filtering them out, or killing
them by boiling or by chemicals. He chose the last method, because it
seemed most feasible. He had seen the deodorizing effect of carbolic acid
on the sewage. In his early experiments he would coat the wounds with
a good amount of carbolic acid which formed an antiseptic crust of
coagulated blood on the wound. The wounds did not become infected
but the tissue damage caused by carbolic acid delayed healing. This led
him to experiment with different types of dressings which reduced the
contact of carbolic acid with the tissues to the minimum. The dressing
consisted of plaster mixed with carbolic acid, spread on calico and coated
with gutta-percha dissolved in benzene. Thus, the carbolic acid prevented
the bacteria from entering the dressing and gutta-percha prevented the
carbolic acid from entering the wound. Lister would wash his hands in 5
percent carbolic acid solution before and after surgery. He also started
sterilizing the surgical instruments by carbolic acid. To disinfect the air
around the patient during surgery, he started carbolic acid spray in the
operation theater (Fig. 4.36). He devised a special machine for the spray
of carbolic acid.

Fig. 4.36: Carbolic acid spray

However, the carbolic acid spray was soon abandoned because
of the side effects of carbolic acid inhalation on the surgeon as well as the
patient. As a result of these antiseptic measures, postoperative infection
in his surgical cases was drastically reduced. The postoperative death
rate in his cases fell from 45 to 15 percent. However, most of his
contemporaries laughed at his methods. Lister was said never to have
bothered to reply, heaving only an occasional sigh at their stupidity.
Lister also devised a new method of repairing fractured kneecaps
with metal wires. In addition, he devised catgut sutures as well as the use of
drainage tubes in wounds. The antiseptic measures advocated by
Lister gradually came to be used worldwide and Lister was considered
the most famous surgeon of England. Even now, he is remembered as
one of the most important surgeons in the history of medicine. In view
of his achievements, Lister was made Baron Lister of Lyme Regis and
became one of the twelve original members of the Order of Merit.
Use of surgical gloves during surgery was an indirect fallout of
Listerism. William Stewart Halsted (1852–1922) was a famous
American surgeon. He is known as the Father of American Surgery. In
the 1880s, he was a strong believer in antiseptic surgical techniques. He was
in love with the head nurse of his operation theater. Unfortunately, the
nurse Caroline Hampton developed allergic dermatitis due to the use of
antiseptic agents during surgery. Halsted could not bear to see his lady
love suffer. He requested the Goodyear Rubber Company to prepare
thin rubber gloves for her use. The surgical gloves, first used in 1890,
were primarily meant for protection of the staff, not the patient. The
use of rubber gloves, however, not only cured the nurse, but also, led to
a significant decrease in postoperative infection. Therefore, Halsted
ordered their use by all the surgical teams of that hospital. Gradually the
use of rubber gloves by all members of the surgical team in the operation
theater became a standard practice all over the world.
Aseptic Era. The jump from antiseptic to aseptic surgical techniques
was quick and painless. Although effective, carbolic acid was unpopular
because of its toxic effects on the skin and lungs of the surgical team.
Around 1880, Dr Lawson Tait, a Scottish gynecologist, became a strong
opponent of Listerism. He did not believe in the “germ theory” but was
acutely aware of the “ward fever,” the complications that led to a high
postoperative mortality. Tait insisted on absolute cleanliness of the
operation theater and instruments. Instead of the application of Lister’s carbolic
acid to the surface of the skin at the site of operation, Tait advocated
thorough cleaning with soap and water. Instead of soaking hands in
carbolic acid advocated by Lister, Tait washed his hands with soap and
water and scrubbed his nails with a brush. Lister merely removed his
jacket and pinned an unsterilized towel to his waist. Tait wore a large,
thoroughly cleaned mackintosh. The nurse assisting him in the operation
was instructed to come in newly washed clothes. Tait used clean washed
instruments rather than spraying the instruments and the operated area
with carbolic acid. With such innovative measures, Tait was able to
demonstrate as much reduction in postoperative mortality as with
Lister’s technique. Since most of the surgeons hated the use of
carbolic acid, Tait’s method became the more popular. This problem was
fully resolved when the German surgeon Ernst Bergmann started heat
sterilization of all the clothes and instruments used in the operation
theater. With the onset of the aseptic era in the 1890s, surgeons could get rid of
postoperative infections.

ROBERT KOCH
Robert Koch (1843-1910) (Fig. 4.37) was a German physician who is
considered the Founder of the Science of Bacteriology. He is credited
with the discovery of tubercle bacillus (1882) and the cholera bacillus
(1883) and for the development of
Koch’s postulates. He was awarded
the Nobel Prize in Medicine in 1905.
Koch, son of a mining engineer,
astounded his parents when, at the age
of five, he told them that he had taught
himself to read with the help of
newspapers. This feat foreshadowed the
intelligence and methodical way of
working in his later life. In the university,
Koch initially studied mathematics and
natural sciences and later studied
medicine. One of his teachers in medi-
cine was the well-known anatomist and
histologist, Friedrich Gustav Jacob Fig. 4.37: Robert Koch
Henle, who was a strong believer in the germ theory of human diseases.
In 1872 Koch became District Medical Officer of Wollstein, a rural
district in the Prussian province of Posen. Here, in a small laboratory,
part of his four-roomed flat, he
started the pioneering work which transformed the methods of laboratory
investigations in medicine.
The first disease Koch investigated was anthrax. Anthrax, a disease
that killed herds of farm animals, was prevalent in the Wollstein district.
Anthrax bacillus had been discovered earlier but no one could prove that
it was the cause of the disease. Koch inoculated mice with the material
taken from the spleen of farm animals that had died of anthrax. All the
mice died whereas the mice injected with material from the spleen of
healthy farm animals did not suffer from the disease. Then he cultured
anthrax bacilli in suitable media on microscope slides and could
detect spore formation. The spores were found to be able to produce
anthrax even after prolonged exposure to the environment. It was for the
first time that a bacterium was shown to be the cause of a particular
disease. When the work on anthrax was presented in 1876, Julius Cohnheim,
a famous pathologist commented: “It leaves nothing more to be proved.
I regard this as the greatest discovery made with bacteria. I believe that
this is not the last time that this young Robert Koch will surprise and
shame us by the brilliance of his investigations.” In 1877, Koch published
an important paper on the investigation, preservation, staining and
photography of bacteria. He illustrated his work with superb
photomicrographs.
By now Koch was recognized as a scientific investigator of the first
rank. He was rewarded in 1880 with a job at the Imperial Health Office
in Berlin, where he set up a laboratory in bacteriology and started the
work on tuberculosis. In those days tuberculosis was the cause of death
in one out of seven cases; about one-third of the people in the productive
middle-age group died of tuberculosis. Within two years, Robert Koch
was able to isolate the tubercle bacilli from the tubercular lesions, culture
them in the laboratory and using the culture could produce tubercular
lesions in experimental animals. He presented his work in March 1882.
Koch’s lecture was so innovative and thorough that it set a precedent for
future presentations in medical research. Koch brought his entire
laboratory to the lecture room—microscope, test-tubes with culture
media, glass slides with stained bacteria, tissue samples and many other
things. Koch explained in detail all the experimental proofs about
tubercular bacilli and their role in the production of tubercular lesions.
At the end of the lecture, the audience was stunned into silence—no
applause, no congratulations. Slowly the people got up to see the
tubercular bacilli under the microscopes with their own eyes. That
evening the ghost of evil air (miasma) as a cause of diseases in
humans was finally laid to rest. Within a few months, news of Koch’s
discovery spread all over the world. Robert Koch became famous as
“Father of Bacteriology.”
In 1891, Koch became Director of a new institute of infectious
diseases in Berlin. During this period he had a team of eminent scientists
who made numerous investigations into the microbes that produced
diseases such as malaria, leprosy, bubonic plague, cholera, sleeping
sickness, conjunctivitis, etc. He traveled to countries such as
Egypt and India, and to Africa, on the invitation of the respective Governments
to help them in the study of infectious diseases prevalent in those
countries.
Robert Koch is credited with the establishment of methodology of
research in bacteriology. He devised culture media such as gelatin and
agar, and the Petri dish (named after his assistant, Julius Richard Petri), as well as many
bacterial staining techniques. Koch laid down certain criteria, which
must be satisfied before it can be accepted that a particular micro-
organism causes a particular disease. The criteria, which came to be
called Koch’s postulates, are as follows:
• The microbe must be present in every case of the disease.
• The microbe must be isolated from the diseased “host” and grown
in a pure culture.
• The disease must be reproduced when the pure culture is introduced
into a nondiseased susceptible “host”.
• The microbe must be recoverable from the experimentally infected
“host”.
On the basis of the experimental methods advocated by Koch, 21 types
of germs that caused diseases in humans had been identified by the year
1900. These included the microbes that caused diphtheria, typhoid,
pneumonia, gonorrhea, cerebrospinal meningitis, leprosy, bubonic plague,
tetanus and some others. “As soon as the right method was found,
discoveries came as easily as ripe apples from a tree,” commented
Koch in 1901. It was Koch who had developed the right methods.
Due to his extensive work on cholera in Egypt and India in 1884,
Koch has been considered the discoverer of the cholera bacillus, Vibrio cholerae.
Actually, as early as 1854, an Italian physician, Filippo Pacini (known
for the discovery of the sensory receptors named after him, the pacinian
corpuscles) had described the comma-shaped cholera bacillus in a research
publication in the Italian language. It was Pacini who gave it the name
Vibrio. His microscopic slides of the organism were clearly labeled,
identifying the date and nature of investigation. In a series of papers in
1866, 1871 and 1880, he described the clinical picture of rice-water
stools with accompanying loss of water and electrolytes and even
recommended the treatment by intravenous injection of 10 grams of
sodium chloride in a liter of water. The discovery was completely ignored
since the germ theory of disease was yet to be accepted. The lobby for
the miasma theory was so strong that, in an international conference as
late as 1885, the British delegation successfully blocked any “theoretical
discussion on the etiology of cholera.” The basic question is whether
Koch was aware of the work of Pacini on the vibrio. How else did he, too,
come to name it Vibrio? In any case, the injustice towards Pacini was rectified 82
years after his death, when an International Committee on Nomenclature,
in 1965, adopted Vibrio cholerae Pacini 1854 as the correct name for
the cholera bacillus.

PAUL EHRLICH
Paul Ehrlich (1854-1915) (Fig. 4.38) was a German scientist who is
remembered for his work on hematology, immunology, and chemotherapy.
He was awarded the Nobel Prize in Medicine in 1908.
Ehrlich was interested in staining microscopic tissues even as a
medical student. Soon after his graduation in medicine, in 1878, Ehrlich
published a method to stain different types of white blood cells. Thus
he laid the foundation of future work in hematology. In 1882, he devised
a method of staining the tubercle bacillus, which had been discovered by
Robert Koch in the same year. His staining
techniques also paved the way for the Gram
method of staining, developed by Hans Christian Gram.
In 1896, Ehrlich was appointed
Director of a newly developed Institute
for Production of Therapeutic Sera. In this
phase of his career, Ehrlich was associated
with another future Nobel Laureate, Emil
von Behring. They made important
discoveries in the field of immunology and
developed the antidiphtheria serum. They
also devised the method to standardize the
potency of the antisera.
In 1899, Ehrlich became the Director
of a newly established Royal Institute of
Experimental Therapy. This phase of his life was devoted to the
development of chemotherapy or what he called “Magic Bullet”. The
basis of the concept of “magic bullet” was that just as a stain acts on a
specific type of bacteria, a chemical could be made available which
attacks the particular type of bacteria, without affecting any other tissue
cells. In this endeavor, Ehrlich found an arsenic compound he named
“Salvarsan,” which could kill the organism causing syphilis, the most
dreaded venereal disease of that era. Salvarsan was not exactly a magic
bullet since it had many side effects on tissues of the body. In fact, no
antibacterial agent could be called a magic bullet. Only the recently
developed monoclonal antibodies act on specific antigens and
therefore can be given the fanciful name of “magic bullets.”
Paul Ehrlich was one of the most famous German scientists of that
era. He was elected member of as many as 81 academic societies spread
over European and American continents. He was awarded honorary
doctorates and titles by the numerous universities and governments.
FLORENCE NIGHTINGALE
Florence Nightingale (1820-1910)
(Fig. 4.39) was the founder of the
modern nursing profession. During the
Crimean War in 1854, her visits to the
sick soldiers at night carrying a dim light
earned her the name “The Lady with the
Lamp,” among the grateful soldiers.
What is not well-known is the fact that
she was a noted statistician and used
that skill to improve the sanitary
conditions of the army hospitals in
England as well as India.
Florence was born in a highly
educated and extremely rich British
family. She was born in the Italian city of Florence when her parents were
touring Europe for the first two years of their marriage. She was given
the name of the town of her birth. In her circle of well-to-do families, a
young lady’s life consisted of intercontinental tours, elaborate dinner
parties, operas and socializing with the rich and the mighty and finally
marrying an eligible young man to ‘live happily ever afterwards’. Florence
was not excited by such activities. On the other hand, Florence expressed
a desire to study mathematics. The parents urged her to study subjects
“more appropriate for a woman.” After many emotional battles, Florence
got permission to study mathematics. She became very proficient in the
subject, especially in statistics. In later life, the knowledge of statistics
played a big role during her crusade for improving the conditions of
army hospitals.
Inspired by what she understood to be a divine calling, Nightingale
decided to devote her life to the care of the sick and the poor by becoming
a nurse. Her family was horrified. In those days, nursing was the most
disreputable profession for a woman. It was taken up by retired disabled
army men or by poor and destitute women with no other means of
support. The image of the profession can be judged from the description
of nurses by a London doctor: “They are all drunkards, without
exception, always tipsy day or night, whom the doctor could seldom
trust to give the medicine to the patients. Immoral conduct is believed to
be practiced in every ward.” In short, nurses were considered just a little
less than the prostitutes. The parents were totally against her choosing
such a profession. Nightingale fought with her parents for eight long years
before she was allowed to take up nursing as a career.
Florence Nightingale’s nursing career began in 1851 when she received
four months’ training as a nurse in Germany. Back home after training
she regularly visited various hospitals in London, Edinburgh and Dublin.
In 1853, she accepted her first administrative post when she became
superintendent of the Hospital for Invalid Gentlewomen. Her father
had given her an annual income of 500 pounds sterling that allowed her to
live comfortably and pursue the nursing career.
In March 1854, Britain and France declared war (the Crimean
War) against Russia. William Russell, the Times’ correspondent, described
the terrible neglect of the wounded British soldiers as compared to much
better care of the wounded French soldiers. Stung by the reports, Sidney
Herbert, the Secretary for War in the British Government, and a family
friend of Nightingales, requested Florence Nightingale to proceed to the
war front in Turkey. With a band of 38 women volunteer nurses,
Nightingale arrived in Scutari, a town in Turkey, where the British
troops were stationed. She found the conditions of the sick or wounded
soldiers appalling. There were no vessels for water or utensils of any
kind; no soap, no hospital clothes; the men were lying in the hospital in
their uniforms soaked in blood and filth, their bodies covered with
vermin. There was not a drop of milk. The meat was more like moist
leather than food. Blankets were rotting in the warehouses while the
men had none; because proper forms for their distribution were not
issued. The lavatories in the hospital consisted of tubs which had to be
emptied by hand. Since no one had been specifically ordered to empty them,
the stench could be smelled miles away from the hospital. Far more men
were dying due to hospital infections than due to war injuries. Conditions
were so bad that cholera and typhus fever broke out in the hospital.
Besides a large number of patients, seven army doctors and three of the
nurses died. There were 2000 sick soldiers in the hospital and the death
rate was as high as 42 percent. With her own money and money donated
by volunteer organizations, she set about procuring wash basins, soaps,
towels, mops, etc. A new laundry was set up. Good food was supplied to
the soldiers. She provided reading rooms for the convalescents. She
prescribed a smart but sober uniform for
the nurses, not only to promote
cleanliness, but also to disarm the critics
and give nurses a respectable look. Due to
her recommendations, repairs were
carried out in the barracks and hospital.
Buildings were ventilated and warmed.
The water supply was improved and the drainage system
introduced or reconstructed. Florence
Nightingale was on her feet for twenty
hours a day. She did not allow other nurses
in the wards after 8 pm, but she frequently
visited the patients carrying a lamp. The
wounded soldiers called her the Lady with
the Lamp (Fig. 4.40).
All the improvements in the hospital
mentioned above were carried out against the wishes of the army
authorities. They were highly resentful of her work and considered her
advice an unnecessary interference. The hospital welfare activities were
said to “spoil and soften the brutes.” But Nightingale found support
from the British government when she sent reports of statistical analysis
of the deaths in the army hospital. At the end of the war, Nightingale
returned to Britain in August 1856 as a heroine. According to the BBC, her
popularity was second only to that of Queen Victoria herself.
In 1855, a public meeting to give recognition to Florence Nightingale
for her work in the Crimean War led to the establishment of the
Nightingale Fund to promote training of the nurses. By 1860, Nightingale
had 45,000 pounds sterling at her disposal. With this money, she set up
the first school of nursing called Nightingale Training School (now
known as the Florence Nightingale School of Nursing and Midwifery).
Before this school, nurses were women volunteers who belonged to
various Christian Churches and called Roman Catholic Sisters of Mercy
or Protestant Sisters of Charity. Even after the nursing profession was
properly organized under the initiative of Florence Nightingale, they
continued to be called “sisters”.
Under the strain of ceaseless overwork, her own health broke down.
Nightingale was an invalid for the later half of her life. Even from her sick-
bed Nightingale continued with her efforts to improve the sanitary
conditions in England and even in India.

Quotes of Florence Nightingale


• No man, even a doctor, ever gives any other definition of what a
nurse should be than this—‘devoted and obedient.’ This definition
would do just as well for a porter. It might even do for a horse.
• It may seem a strange principle to enunciate, but the very first
requirement in a hospital is that it should do the sick no harm.
• I attribute my success to this—I never gave or took any excuse.
• Women have no sympathy and my experience of women is almost
as large as Europe.
• How very little can be done under the spirit of fear.
• I think one’s feelings waste themselves in words; they ought all to
be distilled into actions which bring results.
• Apprehension, uncertainty, waiting, expectation, and fear of surprise,
do a patient more harm than any exertion.
• The progressive world is divided into two classes—those who take
the best there is and enjoy it—and those who wish for something
better and try to create it.

THE RED CROSS MOVEMENT


The Red Cross, with 100 million
members, is at present world’s largest
humanitarian organization. It was
founded by a Swiss businessman,
Henry Dunant, in 1863 (Fig. 4.41).
The idea of such an organization arose
in the mind of Dunant in 1859 when he
traveled to Italy to meet the French
Emperor Napoleon III, in connection
with his business in Algeria (at that
time under French occupation). On
way to meet the Emperor, Dunant
found himself in the thick of the
fierce Battle of Solferino being fought
between the French and Austrian forces.
Fig. 4.41: Henry Dunant
In a single day about 40,000 soldiers
on both sides were dead or wounded and left unattended by both the
armies. The condition of some of the wounded was so pitiable that they
requested Dunant to kill them so as to release them from the agony.
Dunant dropped the idea of business meeting with the Emperor and got
busy organizing help for the wounded in the makeshift “hospitals” in
the homes and barns of the Solferino village. The villagers gave full
cooperation and helped soldiers of both the sides. Back in Geneva,
Dunant wrote a book, “A Memory of Solferino,” in which he described
the horrible experience of the Battle of Solferino and urged people to set
up voluntary relief societies to help the victims of war. The book released
in 1862, was sent to many leading political and military figures in Europe.
Dunant also began to travel through Europe to promote his idea. The
book was positively received and led to the foundation of the Red Cross in
1863, with Henry Dunant among the five members of its founding
committee. The emblem adopted by the organization was the reverse of the
Swiss flag: a red cross against a white background. Dunant got so deeply involved in this movement
that his business suffered heavy losses and he was declared bankrupt in
1867. By this time, he found himself surrounded by colleagues in the
Red Cross Committee of Geneva who were against his leadership. Their
intrigues in Geneva led to his expulsion from the International Red
Cross Society. Dunant was no more welcome in Genevan society. He
left Geneva, never to return. For the next ten years, he wandered
from place to place on foot, like a beggar. There were times, he said,
when he dined on a crust of bread, blackened his coat with ink, whitened
his collar with chalk, and slept out of doors. In 1875, he fell sick and
took refuge in a hospital in Heiden, a small Swiss village, where he
remained till his death in 1910.
Dunant felt rehabilitated when he was awarded the first Nobel
Peace Prize, in 1901. However, he donated all the prize money to
charity and he continued to live in the same hospital. Upon his death,
there was no funeral ceremony, no mourners, and no cortege. In accordance
with his wishes, he was carried to his grave “like a dog”.
Gradually the Red Cross movement spread all over the world and
has been at the forefront during all conflicts, whether localized wars or
World Wars I and II. At the end of WW I, it was instrumental in the
return of 420,000 prisoners to their home countries. A year before the
end of the war, the Red Cross was awarded the Nobel Peace Prize of
1917. The organization received the Nobel Peace Prize again in 1944 and
1963.

THE FIRST WOMAN DOCTORS IN THE USA AND ENGLAND
In 19th century America, women entered the medical profession initially
as nurses and midwives. When they tried to seek admission to a medical
school, there was widespread protest and disapproval. The resentment
among the male doctors can be judged from the following statement:
“Most members of the medical profession perceive the medical education
of women as a horrible and vicious attempt to deliberately ‘unsex’ them.
The acquisition of anatomical and physiological knowledge of man would
lead to gratification of their prurient and morbid curiosity and a thirst
for forbidden information. They would be performing medical and surgical
duties which nature intended for the sterner sex”. A similar view was
expressed in the resolution passed by the Philadelphia County Medical
Society in 1867: “The physiological peculiarities of woman even in
single life (unmarried), and the disorders consequent on them, cannot
frequently fail to interfere with the discharge of duties as a physician.
Interruption would be much greater on marriage and motherhood. The
delicate organization and predominance of the nervous system render
her peculiarly susceptible to suffer, if not sink, under the fatigue and
mental shocks which she must encounter in her professional rounds”.
Elizabeth Blackwell (1821-1910) (Fig. 4.42) was the first woman
doctor trained in the United States. Her father, Samuel Blackwell, was a
businessman in England. In 1832, he moved the family to the USA, but
did not do well. When he died, the family was without any financial
support. To support the family of ten
members, Elizabeth opened a small private
school.
The idea of becoming a doctor was put
in her mind by a woman family friend who
was suffering from cancer. “It is a terrible
thing to die a slow death like this. But for
me, one thing would have made the suffering
much easier: if only I did not have to be examined
and treated by a male doctor. Why don’t
you become a doctor?” she asked Elizabeth.
At first Elizabeth thought it an absurd idea
but gradually she became determined to be a
doctor. Her application was rejected by 29
medical schools. However, the Dean of the
Geneva Medical College in Western New York State felt that
constitutionally her application could not be rejected. Therefore he chose
to ask all the (male) students to vote for or against her admission. The
condition was that even if one person voted against her, she would not be
admitted. The students were shocked. Their initial reaction was: “It is
impossible, ridiculous. Somebody is spoofing us! Woman doctor! Next,
a male mother?” The students, reportedly taking it to be a practical joke,
(or probably better sense prevailed) unanimously voted for her
admission. Thus in 1847, Elizabeth was admitted to the medical college.
However, this was the beginning, not the end of her travails. When the
teacher of anatomy reached the chapter on the reproductive organs, she was
advised to absent herself from the classes. She refused. During the
lecture, some of the students blushed, some others were hysterical, but
no one could suppress a smile. Even for Dr Webster, the anatomist, the
lecture was trying, partly from embarrassment and partly because
her presence had proved inhibiting to his customary ribald approach to
the naturally most popular section of the course of anatomy. Elizabeth
was embarrassed no less. “I sat through the lecture showing indifference,
though my heart palpitated painfully,” she recalled. Throughout the
two years course, she was ignored by her fellow students as well as the
tutors. Despite these problems she stood first in the final examination.
In 1849, she became the first woman to be awarded MD degree in the
USA. (In those days the medical course lasted only two years). Over
2000 people turned out to watch the ceremony. Even here, there was a
problem. She had to be given a special handwritten diploma because the
traditional engraved diplomas were worded in the masculine gender. She
was addressed as Domina, the feminine equivalent of doctor
they could think of.
After graduation, Elizabeth did not succeed in getting a clinical
attachment to improve her clinical skills. She had to move to France, where
she entered the midwifery course at La Maternité in Paris. During this training,
she got purulent ophthalmia from a baby she was treating and lost one
eye. In October 1850, she moved to England to work at a hospital with
the famous Dr James Paget. It was during this period that she became
friends with Florence Nightingale. Together, they addressed a number of
public meetings in support of medical education for women. Elizabeth
Garrett Anderson, a young English girl, attended one of such lectures.
She also decided to become a doctor. After many years of struggle,
Elizabeth Garrett became the first woman to obtain the medical degree
in England.
In 1851, Elizabeth Blackwell returned to New York, where no hospital
or dispensary agreed to give her an appointment. She was even refused
lodging and office space by landlords when she tried to set up private
practice. Ultimately, she had to purchase a house where she started
practice on a small scale. According to her, the first consultation was a
curious experience. In a severe case of pneumonia in an elderly patient,
she called a kind-hearted elderly physician to seek a second opinion.
This gentleman, after seeing the patient, went with her into the parlor,
where he began to walk about in great agitation, exclaiming, “a most
extraordinary case! Such a one never happened to me before. I really
don’t know what to do.” She was surprised to hear this, because it was a
straightforward case of pneumonia, not in any great degree of danger. Then
he explained that the difficulty was the act of medical consultation with a
female doctor, which he had never done before!
In 1853, Elizabeth opened a dispensary in the slums of New York,
where she was joined by her sister, who too, had obtained a medical
degree. Together they established the New York Infirmary for Women
and Children, in 1857, and later in 1869, established the first Women’s
Medical College, exclusively for the training of women doctors. In 1875,
Elizabeth Garrett established the London School of Medicine for Women,
and invited Elizabeth Blackwell to take up the post of professor of
Gynecology. Blackwell remained on this post till 1907.
The discrimination against women doctors in the USA continues to
this day. Most of the deans openly hold women doctors in low esteem.
The reasons given are somewhat amusing: “emotionally unstable”; “talk
too much”; or even “get pregnant”. According to one dean of a medical
college: “I would prefer a third-rate male to a first-rate woman doctor.”
Even nowadays, 84 percent of practicing doctors in the United States
are males. The American Medical Association never had a woman
president in over 150 years of its existence.
Elizabeth Garrett Anderson (1836-1917) (Fig. 4.43) was the first
woman doctor registered in England. In 1859, after attending a lecture
by Elizabeth Blackwell on “Medicine as a Profession for Ladies,”
Elizabeth decided to become a doctor, an unheard-of thing in those
days, regarded by some as even indecent. At first, she tried to be a
surgical nurse. She was the only woman in the class, and was banned
from full participation in the operation theater. When she came first in
the class, her fellow students had her
banned even from the lectures. Then, she
applied to many medical schools but was
rejected by all. She was ultimately admitted
to the Apothecaries’ course as a private
candidate, merely because their rules were
silent on this matter. Just after her admission, the
Society of Apothecaries amended its
rules so that no more women could be
admitted to the course. (Apothecary is a
historical name for medical practitioners
who formulated and dispensed materia
medica to physicians, surgeons and
patients—a role now served by
pharmacists. In addition to drugs, an apothecary was allowed to offer
general medical advice to the patients).
After obtaining the license of the Apothecaries’ Hall in 1865, Garrett was
appointed general medical attendant to St Mary’s dispensary in London,
an institution which provided medical care to poor women by medical
practitioners of the same sex. In 1870, she obtained a degree of MD
from the University of Paris. She continued to work in the same dispensary
for the poor, which was upgraded into the New Hospital for Women.
Garrett served in this hospital for 20 years. She was instrumental in the
creation of the London School of Medicine for Women in 1877. It may be
mentioned that the British accepted a woman doctor more gracefully
than the Americans. In 1897, Dr Garrett was even elected the president
of the East Anglian branch of the British Medical Association. In 1908, she
was elected Mayor of Aldeburgh, the first woman mayor in the whole of
England.
DISCOVERY OF ANESTHESIA
In preanesthetic days, surgeons had tried various measures like alcohol,
opium, and hashish to lessen pain during surgery, but with little success.
Some, more stubborn, went to the extent of compressing the carotid
arteries till unconsciousness supervened, or of producing cerebral
concussion by striking a wooden bowl placed on the head!
In the year 1799, Humphry Davy, an English professor of chemistry,
investigated nitrous oxide (laughing gas), which had been discovered
earlier by Joseph Priestley. The gas was taken up by showmen
to provide entertainment at village fairs. Thirty years later Michael
Faraday showed that ether could produce a similar effect. Both agents
continued to be used at pleasure parties, where the vapors were inhaled
for their exhilarating effects.
In 1844, it was Horace Wells (Fig.
4.44), an American dentist, who detected
the anesthetic property of the laughing
gas. In a laughing gas party, one of the
participants fell and injured his leg.
Horace Wells, who was present, noticed
that the participant was totally unaware
of the injury. On enquiry, the person told
Wells that he felt no pain. The following
day, Wells got his own tooth pulled out
under the effect of the laughing gas. He
tried to take advantage of the discovery
by staging a public demonstration in
1845. Unfortunately, he bungled the
timing and the patient howled with pain when the tooth was pulled out.
Wells earned only ridicule from the public. “It is all humbug,” declared
the people.
After the failure of Wells, his pupil,
the dentist William Morton (Fig. 4.45),
continued experiments on himself with
different gases. One day (in 1846), he
inhaled ether from his handkerchief, and
fell unconscious for about eight minutes.
After that, he began to use ether for all the
tooth extractions at his clinic. He publicly
demonstrated the anesthetic property of
ether when a tumor was surgically removed
from the neck of a young man. The patient
felt no pain during 25 minutes of surgery.
“Gentlemen, this is no humbug,” was the
verdict of the operating surgeon.
Subsequently, Morton, typical of an American, tried to make a
fortune out of his discovery. He did not reveal that ether had been used
during the operation. He named it “Letheon.” He tried to patent his
“secret formula”. He devised a special apparatus (none was needed) to
make his claim for the patent stronger. There were strong arguments and
recriminations about the originality of the discovery. Finally, he lost
both his dental practice and his mental balance. He died in 1868, penniless
and dishonored.
In Scotland, James Young Simpson, a professor of midwifery,
started using ether for “painless delivery,” which became highly popular
among the rich clients. However, use of ether was not without difficulty.
Often the patients under the effect of ether became violent and attacked
Dr Simpson himself. That is why he started experimenting with other
gases. One evening, in 1847, he along with a few friends inhaled
chloroform. At first they became very happy and hilarious, but suddenly
all of them fell unconscious on the ground. Subsequently, Simpson
started using chloroform in all cases of labor.
The use of anesthesia during labor came to be opposed on religious
grounds. Alleviation of pain during labor was considered against the law
of nature. The opposition died down when Queen Victoria of England
received chloroform during childbirth in 1853. The Queen reported
that the effect was soothing and delightful beyond measure. Subsequently,
however, chloroform was found not to be absolutely safe; many deaths
were reported with its use. Still, both ether (in USA) and chloroform (in
England and Europe) continued to be used in surgical practice, till other
safer anesthetic agents were discovered.
Cocaine was the first local anesthetic, used by Karl Koller, in
1884, for ophthalmic surgery. Safer replacements were gradually obtained
from derivatives of cocaine—procaine (1905), and lidocaine (1943).
Acceptance of anesthesia made more protracted surgery feasible,
but it did not revolutionize surgery because of the high death rate due to
postoperative infections. The golden era of surgery began only when
aseptic surgery was accepted, blood transfusion became feasible, and
antibiotics were discovered (1940s).
THEODOR BILLROTH
Theodor Billroth (1829-1894) (Fig. 4.46) was a German-born Austrian
surgeon who is regarded as the founding father of modern abdominal
surgery. Billroth introduced epoch-making treatments which, for
subsequent decades, constituted standard surgical procedures on the
stomach, bile duct and female genitalia. For many years, because of the
innovative work by him and his pupils, his department was famous as
the Vienna School of Surgery. Some of the operations developed by him
are still in use, as such or in a modified form.

Fig. 4.46: Theodor Billroth
After his graduation in 1852, he visited various medical schools in
Europe such as Vienna, Prague, Paris, Edinburgh and London. In 1867,
he became a professor of surgery in Vienna, a post he retained till his death.
This was the most fruitful period of his surgical career. He was a pioneer
in the study of the bacterial causes of wound fever. Billroth was quick to
adopt the antiseptic techniques of Lister and performed many hazardous
operations successfully because of his great ability and caution. With
the threat of fatal surgical infections eradicated, Billroth proceeded to
operate on organs hitherto considered inaccessible. In 1872, he was the
first to remove a section of the esophagus and join the remaining parts
together. In 1873, he performed the first complete excision of the larynx.
He was the first surgeon to excise a rectal cancer and by 1876, he had
performed 33 such operations.
By 1881, Billroth had made intestinal surgery seem almost
commonplace. Now he was ready to attempt the most formidable
abdominal operation conceivable at that time, namely the excision of the
cancerous pyloric part of the stomach followed by end-to-end anastomosis
with the remaining part of the stomach (Billroth operation I) or with the
small intestine (Billroth operation II). After the death of the first patient
undergoing such surgery, Billroth was almost stoned to death in the
streets of Vienna. Soon, however, he was able to perform such surgeries
successfully, which became a great sensation in Europe and the USA.
Budding surgeons from all over the world loved to work in his department
to learn his surgical techniques.
Billroth was famous for his presence of mind and cool-headedness
during surgery. He showed a great ability to invent a new procedure
that might be demanded in a particular case. At the same time, he was
full of consideration for the comfort and well-being of his patient. He
never forgot that he had before him a human being to be treated, not a
mere “interesting case” for display of his surgical dexterity. Possessing
a clear and graceful style, he was the author of numerous papers and
books on medical subjects. He was of artistic disposition and fond of
the violin. He even wrote a book, "Physiology of Music," which was
published by his admirers in musical circles after his death.
Billroth is regarded by many as the leading German surgeon of the
19th century. He had radical ideas for the training of a surgeon. He
advocated a prolonged surgical apprenticeship on completion of medical
studies, consisting of initial operations on cadavers and animals followed
by 2–3 years of assistantship in the surgical department. One of the
many visitors who adopted this method was William Halsted, the
pioneer of American surgery and initiator of the residency program in
the USA.
Quotations of Theodor Billroth
• Only a man who is familiar with the art and science of the past is
competent to aid in its progress in the future.
• Become familiar not only with teaching but also with writing.
• The pleasure of being a physician is little, the gratitude of patients
is rare, and even rarer is material reward, but these things will never
deter the student who feels the (God’s) call in him.
• The physician, the school teacher, the lawyer and the Clergyman
should be the best men of their village or their city, of the circles in
which they move.
• One may perform surgical procedures only if there is a little chance
of success. To operate without having a chance means to prostitute
the beautiful art and science of surgery.
• It is a most gratifying sign of the rapid progress of our times that our
best textbooks become antiquated so quickly.
• Be truthful to admit failures, as they would show the mode of
improvement.
• He who combines the knowledge of physiology and surgery, in
addition to the artistic side of his subject, reaches the highest ideal in
medicine.
• A physician should never appear to be in a hurry, and never
absent-minded.
• The art of winning a patient’s confidence lies in the art of listening.
A patient is always more anxious to talk than to listen.
WILLIAM STEWART HALSTED
William Stewart Halsted (1852-1922) (Fig. 4.47) is known as the
father of American surgery. Founder of the residency training system
of progressive responsibility, he is remembered for many other medical
and surgical achievements. He performed the first radical mastectomy
for breast cancer. His other achievements include advances in techniques
for thyroid, biliary tree, hernia, and intestinal surgery. He also introduced
the use of gloves during surgical operations.

Fig. 4.47: William Halsted
After graduation in 1877, Halsted went to Europe for further surgical
training in Austria and Germany for two years. In Vienna, he worked
under Theodor Billroth, the most well-known surgeon of those times.
Halsted firmly believed that the leading surgeons in Germany, Austria
and Switzerland in those days were the best in the world. Therefore, he
visited all these surgeons, one by one, to get acquainted with their
surgical techniques. He was highly impressed by the method of training
of a surgeon in Germany. Based on this pattern of training, the residency
system of postgraduate training was initially started by him at the Johns
Hopkins Hospital. Subsequently, all other teaching hospitals of the
USA adopted the residency system.
On return from Europe in 1880, Halsted soon became a successful
surgeon in New York City. About this time, he started research work
on cocaine, mainly by self-experimentation. In 1885, he reported the
effect of injection of cocaine into a nerve trunk to block the sensation
of pain. It was the first effective method to operate on a patient without
putting him under general anesthesia. The importance of this discovery
can be judged from the fact that derivatives of cocaine are used as local
anesthetic agents to this day. However, the experimentation proved
disastrous for the surgical career of Halsted and several other colleagues
and students who participated in the research work. All these individuals
became addicted to cocaine. Halsted was no longer allowed to perform
surgeries in the New York Hospital.
Relieved of surgical work, Halsted chose to become an assistant
to William Henry Welch, a pathologist in Maryland. This forced
experience in animal experiments was to prove a boon to Halsted's
surgical skills. Experimenting on dogs, he perfected many techniques of
modern surgery. He learnt the importance of complete asepsis, absolute
control of bleeding, accurate anatomical dissection, exact approximation
of tissues in wound closure and gentle handling of tissues. When he
joined the newly opened Johns Hopkins Hospital in 1890, the experience
of surgeries on animals helped him to become a renowned surgeon once
again. He had such a firm belief in animal experimentation that he
used to say: "The license to practice general surgery should be withheld
from those who have not practiced surgery on animals".
Surgical gloves were introduced by Halsted in 1890. Originally,
he got thin rubber gloves prepared to protect the hands of his
fiancée, the chief operation theater nurse. She had developed a skin
allergy to the antiseptic chemicals used in the operation theater in
those days.
Subsequently, the importance of surgical gloves in aseptic surgery
was realized, and all the members of the surgical team of Halsted began
to use the gloves.
Surgery on inguinal hernias had been associated with high
mortality. Halsted developed a new technique of operation (Halsted's
operation I) that is used even today. Similarly, Halsted's operation II,
the radical mastectomy for carcinoma of the breast, remains a gold
standard to this day.
The icon of aseptic surgery, William Halsted died of postoperative
infection following a gallbladder operation in 1922, though he was
operated on by his own pupil, using his own surgical technique! (That
was the preantibiotic era.)
MEDICAL MISSIONARIES
The art and science of medicine have not always been used merely to
treat the sick or to prevent sickness in a population. Since the middle of
the 19th century, medicine has been used to propagate Christianity in
various parts of the world, especially the African continent, the Indian
subcontinent, South East Asia and China, by means of medical missionaries.
When Christian missionaries were initially sent to parts of Africa
and Asia, they encountered very high mortality due to illnesses like
cholera, malaria and many other diseases not found in Europe. For
example, the average life expectancy of a missionary in Africa in the 19th
century was eight years. It was reported that 61 percent of deaths of
British missionaries were preventable by better medical care. Among
the women missionaries in Africa, the death rate during childbirth was as
high as 5 percent as compared to one maternal death per thousand
childbirths in Europe. As a result, it became prudent to give some
missionaries an elementary course in medicine so that they could protect
each other. Later, some regularly trained doctors were employed to look
after the health of all the missionaries in a region. These missionary
doctors began to provide health care to nonmissionaries also and became
popular among the general population. Soon it was realized that the
work of these medical missionaries could be used as a powerful tool for
Christianizing populations who were otherwise hostile and resistant
to the preaching of the teachings of Jesus Christ (the Gospel). A resolution
to this effect was passed at a missionary convention in England in 1860.
As a result, missions sent doctors to difficult areas with the express
purpose of softening up the people for receiving the Gospel later. The
medical missionary's assignment was purely to act as bait to help the
preachers who came into the field later on.
Besides providing medical care, the medical missionaries began to
train the local population in Western medicine. They opened medical
schools, where Christian converts were given free education. The
Christian Medical School Vellore (India) was one of many such schools
opened for this purpose. In 1850, there were only ten to fifteen medical
missionaries. By 1900, there were 650 of them. Their number further
increased throughout the 20th century. In 1963, there were 1231 mission
hospitals. These were staffed by 828 missionary doctors and 1321
missionary nurses, as well as 1317 local doctors and 6928 local nurses.
The important role of medical missionaries in the propagation of
Christianity can be judged from the following account: The first
missionary to Korea came off the passenger ship in 1869, and was
bludgeoned to death 15 minutes after stepping on shore. It took a
Christian physician, Dr Horace Allen, to establish the necessary
beachhead which permitted the missionaries to enter Korea in 1884 and
stay. Dr Allen was able to save the life of a nephew of the King when the
local physicians could not stop the bleeding from a deep wound. After
the treatment, the prince claimed that Dr Allen had come not from
America but straight from heaven. The laws were changed to allow
Koreans to become Christians, and today Korea has one of the largest
Christian communities in Asia.
Some of the medical missionaries took on a much bigger role than
was intended for them. One such example is Dr David Livingstone
(Fig. 4.48). David Livingstone was trained as a missionary but had
some training as a physician as well. He was assigned to preach the
Gospel and bring "civilization" to the "barbarians" of Africa. He was the
first European to meet the local tribes. He gained their trust as a healer
and medical man. Soon, his chief interest became the exploration of the
African continent. He was a prolific writer, and his journals, letters and
published narratives provided the first account of many diseases
prevalent in Africa. He traveled far and wide in the continent and
published accurate maps and trade routes, which led to the race among
the European countries for colonization of the "Dark Continent".

Fig. 4.48: David Livingstone
HISTORY OF WESTERN MEDICAL EDUCATION IN INDIA
By the beginning of the 19th century, the British Government controlled a
large part of India. Consequently, there was a marked increase in the
number of British soldiers and civilians in the country. The number of
British doctors posted in India was insufficient for the requirement of
so many white men and women. To overcome the shortage of medical
manpower, it was decided to train some
native Indians in the art of Western
medicine in addition to the indigenous Ayurveda. In 1822, "The Native
Medical Institution" was started by the Army in Calcutta, the capital of
British India. The students of Ayurveda in the Sanskrit College were
taught Sanskrit translations of European works in anatomy, medicine
and surgery. One of the teachers of this institution was Pandit
Madhusudan Gupta (Fig. 4.49). He translated some of the Western
medical books into Sanskrit. From 1822 to 1835 the institution trained
166 native doctors, who were employed as assistants to the British doctors.

Fig. 4.49: Pandit Madhusudan Gupta
In 1835, it was decided that medical education for the natives
would be purely in the English language, and official patronage to
indigenous medical systems was terminated. With this view in mind, the
Calcutta Medical School was opened. As expected, it was manned by
British doctors, but the services of Madhusudan Gupta were retained in
the new institution. The most ticklish problem faced by the newly
admitted students was the dissection of the cadavers, an essential
component of the Western medical curriculum. Since no high-caste Hindu
was expected to touch, much less dissect, a corpse, it was left to
Madhusudan Gupta to initiate the practice. On 10th January 1836,
Pandit Madhusudan Gupta led a team of four students (including his
own son) and dissected a human corpse. This act was taken as a sign of
great advancement of the natives. Guns boomed in the Calcutta Fort.
Madhusudan was feted and lionized by the British teachers. A
commemorative plaque was unveiled in the school that can be seen in
the Anatomy Theater even today. In 1838, four students (out of 11
students of the first batch) managed to pass the grueling examination
spread over seven days. All of them were absorbed in government jobs.
By 1860 the student strength had increased, but only a few managed to
pass because the quality of students admitted was very poor. However, even
those who failed started practice in the Western system of medicine
because no registration was required. Gradually, the criteria for admission
were made stricter.
Within ten years of the opening of the first medical school, the
British government opened medical schools at Madras and Bombay as
well. Later, medical schools imparting Western medical education started
functioning in other big cities like Agra and Lahore. Christian missionaries
also opened some medical schools. The first among them was the
Christian Medical School, Ludhiana, in Punjab, established in 1894. This
medical school admitted only women students, who were taught by
Christian women doctors only.
At the beginning of the 20th century, there were a good number of Indian
doctors with British medical degrees, but none of them was ever
appointed as a teacher in the clinical departments of the medical colleges
managed by the British government. The appointment of Indian doctors
was restricted to nonclinical departments only. Nationalist Indians in
Bombay decided to take corrective measures by planning to start a medical
college to be staffed by properly qualified Indian teachers only. Notable
figures to lead the proposal were Dr Bhadurji and Dr Jivraj Mehta. The
first major problem of financing the project was overcome by a donation
of Rs 12 lacs by successors of Seth Goverdhandas Sunderdas, a rich
merchant of Bombay. Dr Jivraj Mehta had just returned to Bombay
from England after receiving the coveted MD degree. He suggested a
radical departure from the existing designs of medical colleges in India
where isolated blocks housed different departments. Dr Mehta suggested
that the entire new medical college be housed in one large building and
the hospital in a separate but adjacent building. The two buildings were
to be interconnected by a covered corridor so that the doctors, students
and the patients could easily move from one building to another. Built
on this plan, the first truly Indian medical college, the Seth Goverdhandas
Sunderdas Medical College, started functioning in Bombay in 1926. This
was the first multistoreyed medical college and attached hospital in
India. To keep the British Government in a favorable frame of mind, the
attached hospital was named the King Edward VII Memorial (KEM)
Hospital. Almost all the medical colleges built subsequently in India
were designed on the architecture of Seth GS Medical College.
DR JIVRAJ MEHTA
Dr Jivraj Mehta (1887–1978) (Fig. 4.50) was the founder-architect
and Dean of the first Indian nongovernment medical college, Seth
Goverdhandas Sunderdas Medical College and KEM Hospital, Bombay.

Fig. 4.50: Jivraj Mehta

Mehta was born in a poor family of Gujarat. He was able to continue
his school education by giving tuitions to other students. His medical
education at Grant Medical College, Bombay was financed by the father
of one of his tuition-pupils, Dr Eduljee, and by another scholarship
offered by a trust. His family was so poor that Mehta sent part of the
scholarship money to his parents. During
his undergraduate career, Mehta worked so hard that he bagged
seven of the eight prizes offered to his batch of students. His brilliant
undergraduate career was responsible for the grant of a student loan
from the Tata Education Foundation, Bombay, for further medical
education in the UK.
Mehta studied in London from 1909 to 1915. He passed the MD
examination in 1914 with a university medal and became a Member of
the Royal College of Physicians, London, in 1915.
On his return to Bombay, Mehta started an extremely rewarding
private practice. But within less than a year, he was requested by Sir
Ratan Tata, a celebrated industrialist of Bombay, to accompany him to
London, where he was going for medical treatment. Probably because
the Tata family had paid for his education, Dr Mehta discontinued his
medical practice and traveled to London with Sir Ratan by ship. Those
were the days of World War I. The ship was torpedoed by the Germans,
but both Sir Ratan and Dr Mehta survived the shipwreck.
In London, Dr Mehta himself developed pulmonary tuberculosis,
which necessitated a prolonged stay in a sanatorium in Switzerland. In
Switzerland, Dr Mehta met Sir Sayajirao Gaikwad, the Maharaja of the
State of Baroda. The Maharaja was so impressed that he invited
Mehta to become his personal physician. Mehta accepted the offer and
worked in Baroda for over two years. In the meantime, a group of
Indian doctors and philanthropists in Bombay initiated a move to start
a new medical college to be manned by Indian doctors only. Dr Mehta
was unanimously elected to lead the development of the new college.
According to Dr Mehta, the mandate was to show that Indians could
run a medical college as well as, if not better than, the British
doctors. To this end, Dr Mehta worked day and night and made GS
Medical College and KEM Hospital an example of excellence.
Throughout his medical career, Dr Mehta was closely associated
with the freedom struggle. Consequently, after retirement as Dean of GS
Medical College, Dr Mehta occupied many important ministerial posts
culminating in the appointment as the first Chief Minister of the State
of Gujarat (1960-63) and Indian High Commissioner in London
(1963-66).
TROPICAL MEDICINE—A BYPRODUCT OF IMPERIALISM
Imperialism played an important role in the development of medicine,
especially tropical medicine. By the middle of the 19th century, the
industrialization of Europe had created a huge demand for raw materials
and a desire for exclusive overseas markets. Consequently, many European
countries began to emulate Britain in creating overseas empires. Africa,
the Dark Continent, was the largest prize up for grabs. Germany, France,
Holland and Belgium were vying with Britain for a stake in the spoils.
Much of the land still available for colonization had a tropical climate and
tropical diseases. If medicine could tame the diseases that were rampant
in the tropics, it could be an important political tool of empire. The
country with the most advanced medical capabilities stood the greatest
chance of success in the hostile environments of Africa and South East
Asia. Therefore, any country with imperialistic ambitions had first to
master tropical medicine. Britain, which was already in control of the
greater part of India, was the first to realize this fact, and a School of
Tropical Medicine was established in London in 1899.
(Sir) Patrick Manson (1844-1922), a Scot, was the first practitioner
of tropical medicine. He had gone to the Far East as a medical officer.
He worked in Amoy, off the coast of South East China. He published an
important work on the causative role of Filaria, a nematode worm, in
elephantiasis, the chronic disfiguring disease leading to massive swelling
of the limbs and genitalia. He implicated the bite of the mosquito in the
spread of the disease and showed the benefit of mosquito nets in the
prevention of elephantiasis.
Back in England in 1889, Manson became a successful London
consulting physician, specializing in the diseases contracted by Europeans
in tropical climates. In 1898, he published a textbook, "Tropical Diseases:
A Manual of the Diseases of Warm Climates," and in 1899, the School of
Tropical Medicine was established by him in London. Manson advocated
tropical medicine as a specialty of medicine, distinct from bacteriology,
because tropical diseases like bilharziasis, sleeping sickness, amoebic
dysentery and malaria were spread by classes of organisms other than
bacteria. Within a few years of the foundation of the London School and
its rival in Liverpool, institutions in tropical medicine were established
in France, Germany, Italy, Belgium and the United States. These schools
of tropical medicine were primarily established for the protection of
European armed forces and businessmen stationed in tropical countries.
Only slowly and grudgingly were the benefits of research in tropical
medicine extended from European settlers to the local population.
THE ANTIPYRETIC ERA
Fever has been recognized as a sign of ill health since the times of
Charak and Hippocrates. However, bacterial and viral infections were
not recognized as the actual cause of most fevers till the end of the 19th
century. Up to that time, fever itself was considered a disease, but no
treatment was available till cinchona bark was found to be effective
for malarial fever. For all other fevers, various therapies were tried
based on the whims and fancies of the physician. Till the end of the 18th
century, the sweating cure was fairly popular. Fever was supposed to
reflect the presence of poisons in the body. Besides the ever popular
bloodletting, induction of heavy sweating was considered an alternative
method for the excretion of body poisons. A common practice was to
heap a lot of woollen clothes on the patient.
James Currie started cold water therapy as a treatment of fever in
1777. Traveling on a ship, the physician himself developed fever while
treating the sailors. When no other remedy was successful, he requested
a sailor to pour three buckets of ice-cold water on his naked body. Currie
felt immediate relief. The headache and body pains abated and
the body temperature felt normal. By evening, the febrile symptoms
returned and the same treatment was repeated with good results. From
that day onwards, Currie started treating all febrile conditions with
ice-cold water. He published a detailed account of the mode of treatment,
including the time of the day when it would be most effective. Cold
water was to be either poured on the naked body or administered as an
enema. He even claimed to have cured patients of pulmonary
consumption (tuberculosis) by cold water therapy alone. However, it
would be pertinent to add that in spite of the availability of antibiotics
as well as potent antipyretics, cold water immersion is one of the
emergency treatments recommended in cases of hyperpyrexia even today.
The bark of the willow tree had been used as a folk medicine for the
treatment of aches and pains since time immemorial. In 1838, the
active ingredient of the bark was isolated as salicylic acid. It came to be
widely used as a painkiller, especially in cases of rheumatism. The drug
was effective but showed intolerable side effects, especially severe gastric
irritation. One of the patients taking salicylic acid for rheumatism was
the father of a German chemist, Felix Hoffmann (Fig. 4.51). Hoffmann
was working in the famous German pharmaceutical company, Bayer.

Fig. 4.51: Felix Hoffmann

The dutiful son took upon himself the task of developing a less toxic
replacement. He came out with the compound acetylsalicylic acid, which
was marketed by Bayer as aspirin, in 1899. By that time physicians had
become intensely interested in the possible deleterious effects of fever
on the human body. It was believed that fever, when sufficiently high,
could coagulate the protoplasm of vital organs and hence it needed
most urgent treatment. Thus, the discovery of a safe antipyretic, aspirin,
started the antipyretic era of treatment. Aspirin was sold by the ton. It
even began to be smuggled into the USA for sale at an exorbitant price.
Gradually, with the acceptance of the germ theory of disease, it came to
be realized that a decrease in body temperature does not cure a fever
unless the basic cause, bacterial or viral infection, was tackled. Such a
remedy became available only when sulphonamides were discovered in
the 1930s and antibiotics in the 1940s.
Aspirin remained the most effective antipyretic, analgesic and
anti-inflammatory agent till paracetamol was discovered in 1948 and
marketed for the first time in 1956. Now paracetamol is considered the
safest antipyretic agent. Aspirin remains an important drug as an effective
painkiller. Recently, its use as a platelet antiaggregant has been widely
recommended for the prophylaxis and treatment of cerebral stroke and
myocardial infarction.
THE NOBEL PRIZE
The Nobel Prizes are international awards given yearly since 1901
for the highest achievements in physics, chemistry, physiology or
medicine, literature, and peace (Fig. 5.1). The sixth Nobel Prize, in
economics, has been awarded only since 1969. The Nobel Prize is widely
regarded as the supreme commendation in the world today. However,
the Nobel Prizes are not without controversy. For example, the greatest
apostle of peace, Mahatma Gandhi, was never awarded a peace prize. In
contrast, two known merchants of death, Henry Kissinger (Vietnam War)
and Yasser Arafat (Palestinian conflict), were awarded the Nobel Prize
for peace!

Fig. 5.1: Nobel medal
The Nobel Prize is named after Alfred Nobel, a Swedish chemist
and the inventor of dynamite. He endowed a 9 million dollar fund in his
will. The interest from this endowment was to be used as awards for
people whose work most benefited humanity. Each Nobel Prize consists
of a gold medal, a diploma and a sum of money.

Fig. 5.2: Alfred Nobel

20th Century Medicine | 181

Alfred Nobel (1833-1896) (Fig. 5.2) was born in Sweden. His father,
Immanuel Nobel, was an architect. In 1859 Alfred and his younger
brother started
work on a highly volatile material, nitroglycerine. In 1864 an explosion
in their factory killed the younger brother and several other people, but
the work continued. Alfred was successful in making a mixture of
nitroglycerine with a stable material which would explode only with a
detonator. Alfred named it dynamite and got it patented in 1867. It was
meant to be used by construction and mining companies but the biggest
orders came from the army. Sale of dynamite made Alfred an extremely
wealthy man. His other researches led to the development of artificial
rubber, silk and precious stones. He held a total of 355 patents in his
name.
Ironically, Alfred Nobel is believed to have said that his discovery of
dynamite would lead to peace in the world. "My dynamite will sooner
lead to peace than a thousand world conventions," he declared. "As
soon as men find that in one instant whole armies can be utterly
destroyed, they surely will abide by golden peace." Alfred Nobel did
not live to see the devastation produced by the use of dynamite in
WW I and WW II. Actually, between 1867, when dynamite was invented,
and 1896, when Alfred Nobel died, dynamite was extensively used in
conflicts all over the world and caused the death of thousands of soldiers.
Nobel's feeling of guilt was further aggravated by the label of "Merchant
of Death" given by a French newspaper, which mistakenly published
Alfred's own obituary on the death of his brother. Not wanting to go
down in history with such a horrible epitaph, Nobel wrote a will, leaving
almost all his wealth to establish awards for "those who, during the
preceding year, have conferred the greatest benefit on mankind."
182 | History of Medicine

NOBEL PRIZE WINNERS IN PHYSIOLOGY OR MEDICINE

Year Name Country Subject
1901 Emil A von Behring Germany Serum therapy
1902 Ronald Ross GB Malaria
1903 Niels R Finsen Denmark Treatment with light
radiation
1904 Ivan P Pavlov Russia Physiology of digestion
1905 Robert Koch Germany Tuberculosis
1906 Camillo Golgi Italy Structure of nervous
system
Santiago Ramon Y Cajal Spain
1907 Charles LA Laveran France Role of protozoans in
disease
1908 Elie Metchnikoff Russia Immunity
Paul Ehrlich Germany
1909 Emil T Kocher Switzerland Thyroid gland
1910 Albrecht Kossel Germany Proteins
1911 Allvar Gullstrand Sweden Dioptrics of the eye
1912 Alexis Carrel France Structure of blood
vessels
1913 Charles R Richet France Anaphylaxis
1914 Robert Barany Austria Vestibular apparatus
1915 —
1916 —
1917 —
1918 —
1919 Jules Bordet Belgium Immunity
1920 Schack A S Krogh Denmark Regulation of motor
mechanisms of
capillaries
1921 —
1922 Archibald V Hill GB Production of heat
during muscle
contraction
Otto F Meyerhof Germany Lactic acid production
in muscles
1923 Frederick G Banting Canada Discovery of insulin
John J R MacLeod Canada
1924 Willem Einthoven Netherlands Electrocardiogram
1925 —
1926 Johannes A G Fibiger Denmark Spiroptera carcinoma
1927 Julius Wagner-Jauregg Austria Treatment of cerebral
syphilis by malaria
1928 Charles J H Nicolle France The louse as the vector
of typhus
1929 Christiaan Eijkman Netherlands Dietary-deficiency
diseases
Frederick G Hopkins GB Accessory food factors
1930 Karl Landsteiner Austria Human blood groups
1931 Otto Heinrich Germany Nature and mode of
Warburg action of the respi-
ratory enzyme
1932 Charles S Sherrington GB Function of neurons
Edgar D Adrian GB
1933 Thomas H Morgan USA Role of chromosomes
in heredity
1934 George Hoyt Whipple USA Liver treatment of
pernicious anemia
George R Minot USA
William P Murphy USA
1935 Hans Spemann Germany Embryonic
development
1936 Henry H Dale GB Chemical transmission
of nerve impulse
Otto Loewi Austria
1937 Albert Szent-Gyorgyi Hungary Biological combustion,
with special reference
to vitamin C
1938 Corneille Jean Belgium Role of sinus and aortic
Francois Heymans mechanism in the
regulation of respiration
1939 Gerhard Domagk Germany Sulphonamide
prontosil
1940 —
1941 —
1942 —
1943 CP Henrik Dam Denmark Discovery of vitamin
K
Edward A Doisy USA Chemical nature of
vitamin K
1944 Joseph Erlanger USA Function of single
nerve fibers
Herbert S Gasser USA
1945 Alexander Fleming GB Discovery of Penicillin
Ernst B Chain GB
Howard W Florey GB
1946 Hermann J Muller USA Use of X-rays to induce
genetic mutation
1947 Carl F Cori USA Carbohydrate
metabolism
Gerty T Cori USA
Bernado A Houssay Argentina Role of pituitary
hormone in sugar
metabolism
1948 Paul H Muller Switzerland Development of
insecticide DDT
1949 Walter R Hess Switzerland Functional mapping of
the brain
Antonio Egas Moniz Portugal Leucotomy for the
relief of schizophrenia
1950 Edward C Kendall USA Hormones of adrenal
cortex
Tadeus Reichstein Switzerland
Philip S Hench USA
1951 Max Theiler S. Africa Yellow fever and
vaccine for it
1952 Selman A Waksman USA Streptomycin
1953 Hans A Krebs GB Citric acid cycle
Fritz A Lipmann USA Molecular structure of
coenzyme A
1954 John F Enders USA Cultivation of polio
viruses in tissue culture
Thomas H Weller USA
Frederick C Robbins USA
1955 Axel Hugo Theorell Sweden Oxidative enzymes
1956 Andre Frederic Cournand USA Cardiac catheterization
Werner Forssmann Germany
Dickinson W Richards USA
1957 Daniel Bovet Italy Antihistaminic drugs
and muscle relaxants
1958 George W Beadle USA Genetic mutations
Edward L Tatum USA
Joshua Lederberg USA
1959 Severo Ochoa USA Biosynthesis of RNA
and DNA
Arthur Kornberg USA
1960 Frank Macfarlane Burnet Australia Acquired immunologi-
cal tolerance
Peter B Medawar GB
1961 Georg von Bekesy USA Analysis and
transmission of sounds
1962 Francis H Crick GB Structure of nucleic
acids
James D Watson USA
Maurice HF Wilkins GB
1963 John C Eccles Australia Biophysics of nerve
transmission
Alan L Hodgkin GB
Andrew F Huxley GB
1964 Konrad E Bloch USA Cholesterol and fatty
acid metabolism
Feodor Lynen Germany
1965 Francois Jacob France Control of gene action
Andre Lwoff France
Jacques Monod France
1966 Francis Peyton Rous USA Tumor-inducing
viruses
Charles B Huggins USA Hormonal treatment of
prostate cancer
1967 Ragnar Granit Sweden Primary physiological
and chemical processes
in the eye
Haldan K Hartline USA
George Wald USA
1968 Robert W Holley USA The genetic code and
its role in protein
synthesis
Har Gobind Khorana USA
Marshall W Nirenberg USA
1969 Max Delbruck USA Gene replication and
viral genetics
Alfred D Hershey USA
1970 Bernard Katz GB Release of
neurotransmitters
Ulf von Euler Sweden
Julius Axelrod USA
1971 Earl W Sutherland, Jr. USA Hormone action
1972 Gerald M Edelman USA Chemical structure of
antibodies
Rodney R Porter GB
1973 Karl von Frisch Germany Individual and social
behavior patterns
Konrad Lorenz Austria
Nikolaas Tinbergen GB
1974 Albert Claude Belgium Structural and
functional organization
of cells
Christian de Duve Belgium
George E Palade USA
1975 David Baltimore USA Interaction of tumor
viruses and the genetic
material of the cells
Renato Dulbecco USA
Howard M Temin USA
1976 Baruch S Blumberg USA Origin and dissemi-
nation of infectious
diseases
D Carleton Gajdusek USA
1977 Roger Guillemin USA Peptide hormone
production by the
hypothalamus
Andrew V Schally USA
Rosalyn S Yalow USA
1978 Werner Arber Switzerland Restriction enzymes in
molecular biology
Daniel Nathans USA
Hamilton O Smith USA
1979 Alan M Cormack USA CAT-scan
Godfrey N Hounsfield GB
1980 Baruj Benacerraf USA Discovery of major
histocompatibility
complex
Jean Dausset France
George D Snell USA
1981 Roger W Sperry USA Research on cerebral
hemispheres
David H Hubel USA
Torsten Wiesel Sweden
1982 Sune K Bergstrom Sweden Discovery of
prostaglandins

Bengt I Samuelsson Sweden
John R Vane GB
1983 Barbara McClintock USA Mobile genetic
elements
1984 Niels K Jerne Denmark Theories of immunity;
production of
monoclonal antibodies
George J F Kohler Germany
Cesar Milstein GB
1985 Michael S Brown USA Regulation of
cholesterol metabolism
Joseph L Goldstein USA
1986 Stanley Cohen USA Growth factors
Rita Levi-Montalcini Italy
1987 Susumu Tonegawa Japan Antibody diversity
1988 James W Black GB Design of new drugs
Gertrude B Elion USA
George H Hitchings USA
1989 J Michael Bishop USA Retroviral oncogenes
Harold E Varmus USA
1990 Joseph E Murray USA Organ and cell
transplantation
E Donnall Thomas USA
1991 Erwin Neher Germany Discovery of ion
channels
Bert Sakmann Germany
1992 Edwin Krebs USA Regulation of biological
processes by phos-
phorylation of proteins
Edmund H Fischer USA
1993 Richard J Roberts GB Split genes
Phillip Sharp USA
1994 Alfred Gilman USA G proteins
Martin Rodbell USA
1995 Edward B Lewis USA Genetic control of
embryonic development
Christiane Nusslein- Germany
Volhard
Eric F Wieschaus USA
1996 Peter C Doherty Australia Role of MHC in
immune response
Rolf M Zinkernagel Switzerland
1997 Stanley B Prusiner USA Discovery of prions
1998 Robert F Furchgott USA Signaling property of
nitric oxide in CVS
Louis J Ignarro USA
Ferid Murad USA
1999 Gunter Blobel USA Signals of transport and
localisation of proteins
2000 Arvid Carlsson Sweden Neurotransmitters
Paul Greengard USA
Eric R Kandel USA
2001 Leland H Hartwell USA Regulation of cell cycle
R Timothy Hunt UK
Paul M Nurse UK
2002 Sydney Brenner UK Genetic control of cell
death
H Robert Horvitz USA
John E Sulston UK
2003 Paul C Lauterbur USA Magnetic resonance
imaging
Peter Mansfield UK
2004 Linda B Buck USA Odorant receptors
Richard Axel USA
2005 Barry J Marshall Australia Discovery of
Helicobacter pylori
Robin Warren Australia
2006 Andrew Z Fire USA RNA interference:
gene silencing by
double-stranded RNA
Craig C Mello USA
2007 Mario R Capecchi USA Discoveries of princi-
ples of introducing
specific gene modi-
fications
Martin J Evans UK
Oliver Smithies USA

AMERICAN RESEARCH WORKERS IN MEDICINE: FROM ZERO TO HEROES
Since the 18th century, the Germans, French, and Italians had been at the
forefront of medical research. The reason was the well-established, high-class
traditions of medical education in these countries. Scientists and research
workers were a highly honored class of people. In contrast, medical education
in the USA had a business-dominated atmosphere. The medical institutions
were blatantly commercial, understaffed, and offered medical degrees at
cut-price rates.
Actually, till the 1880s anyone could call himself a doctor in the USA.
Typically, students joined a medical school only when they were found
unfit for any other money-making profession. The medical education
was a two-year affair, with students repeating in the second year what they
had been taught in the first year. There was no opportunity for dissection
or to see patients. One of the more serious-minded students gave
the following description of a medical school in the USA in 1847: Most
of the students were not even making a pretense of listening. Several
were unabashedly asleep. Mumbles and snickers in one part of the
classroom indicated the locale of a vulgar joke or a spicy story, whereas
thumps and squeals in another indicated a brewing storm.
In 19th century America (even today?), the greatness of a man was
judged by the amount of money he made. Professors of arts and sciences
were mildly esteemed but otherwise considered useless and impractical
persons. Neither the salary nor the prestige accorded to a university professor
was near to that in Europe. That is why anyone really keen to learn
medicine went to one of the German-speaking universities of Europe.
This trend continued till the end of the World War I, after which, the
progress in the medical education in the USA gained momentum and
soon overshadowed that in the UK and Europe.
The point discussed above is made very clear by the number of
Nobel laureates in physiology or medicine in different quarters of 20th
century given below. It was only in 1933 that an American got the Nobel
Prize in Physiology or Medicine for the first time since the institution
of the awards in 1901.
Number of American Nobel Laureates in Physiology or Medicine.
1901–1925 1926–1950 1951–1975 1976–2000
Nil 12 31 34
THE BROWN DOG AFFAIR


Animal experiments were a regular feature of medical research
throughout 18th and 19th century. Especially, advances in human
physiology were impossible without experiments on animals like rats,
guinea pigs, cats, dogs, and sheep. However, in England, concern with
animal welfare was expressed even in 17th and 18th century literature
by opposing activities like horse beating, bear-beating, cock-fighting
and similar sports. These activities were the primary target for the
organized animal protection movement. Thus a Society for the Prevention
of Cruelty to Animals was established in 1824 by the social reformer
Richard Martin. It received the Royal Charter in 1840, decades before
foundation of a society to prevent cruelty to children. According to a
wag, this showed that the British love their animals more than their
children.
No wonder, the movement against animal experimentation took
roots in 19th century England. It came to be known as antivivisection
campaign. These experiments came to the public notice when a French
physiologist, Eugene Magnan, held a public demonstration of an experiment
on an unanesthetized dog in 1874. A prosecution for wanton cruelty was
brought against the French physiologist and the three British doctors
who had arranged the demonstration. The case fell through mainly because
the chief “culprit” was safely back in France. Further publicity to the
animal experiments was given by the publication of a book “Handbook
for the Physiological Laboratory,” by a professor of physiology in
London. Thus, on one hand, the animal experiments became a routine
part of medical curriculum in England. At the same time, the book
provided ammunition to the antivivisectionists. They became aware of
what was going on in the medical schools and laboratories. As a result of
widespread political debates, Cruelty to Animals Act was passed in
British Parliament in 1876.
The Brown Dog affair (Fig. 5.3) was
a controversy in England which lasted
from 1903 to 1910. It revolved around the
anti-vivisection movement and a statue
erected in memory of a dog killed by a
physiologist, William Bayliss, during the
demonstration of an experiment. The
Brown Dog affair provoked riots in
London on a scale never seen again
over such a matter.

Fig. 5.3: Brown dog monument

In 1903, Bayliss demonstrated the
famous discovery of the hormone
secretin on a dog at
University College London. Among the
students were two Swedish girls who
took detailed notes of the procedure. The Swedish medical students
were so shocked to see the experiment that they abandoned the idea of
pursuing a medical career. In addition, they handed over the written
notes to Stephen Coleridge, Chairman of the National Antivivisection
Society of England. Coleridge publicly accused Bayliss of violating the
Cruelty to Animals Act. Bayliss won the case and was awarded a substantial
amount as damages (which he donated to medical research). Even then,
the mass hysteria generated by the Brown Dog affair, as the legal battle
came to be called in the press, refused to die down. A newspaper launched
a fund to cover the cost of litigation. Within four months, the collection
was about three times the amount awarded by the court as damages to
Bayliss. The balance was used by the anti-vivisection organizers to
raise a memorial to the dog mentioned in the Bayliss versus Coleridge case.
The memorial was in the form of a drinking fountain for people and dogs
surmounted by a bronze of the dog in question. The statue bore the
following inscription:
“In Memory of the Brown Terrier Dog done to Death in the
Laboratories of University College in February 1903. — Also in Memory
of 232 dogs vivisected at the same place during the year 1902.—”
Medical students of a number of medical schools of London, outraged
at the inscription on the memorial, organized protests and attempts to
damage or destroy the statue. The feelings for and against the statue
were so strong that there were mass protests, riots, and civic disobedience.
Finally, the statue was removed in 1910.

WILHELM CONRAD ROENTGEN


Wilhelm Conrad Roentgen (1845–1923)
(Fig. 5.4) was a German physicist who
discovered X-rays in November 1895 and
started the development of medical imaging
techniques. While working with a newly
invented primitive cathode ray generator
called the Crookes tube, Roentgen suddenly
noticed a faint green light against the wall.
The odd thing he noticed was that the light
from the cathode ray generator was traveling
through materials such as books and wood.
Then he started putting various objects in
front of the generator and was amazed to
see the outline of the bones of his hands
displayed on the wall.

Fig. 5.4: Wilhelm Conrad Roentgen

Just two months after the initial discovery, he
published a paper entitled “On a New Kind of Rays.” He also gave
a demonstration before the Physical-Medical Society in December
1895 where the first X-ray picture (of his wife’s hand) was shown (Fig.
5.5). Since the nature of the radiations was still unknown, Roentgen called
them X-rays. Much against his wishes, the
Physical-Medical Society named them
Roentgen rays but, probably because of its
simplicity, the name X-rays continues to
be used worldwide.

Fig. 5.5: The first X-ray picture

The X-rays are one of the few medical
discoveries which received immediate
recognition. In January 1896, the German
Emperor had an X-ray picture taken of his
crippled left arm to determine the nature
of the deformity. Queen Amelie of Portugal
got the feet and legs of her court ladies
X-rayed to demonstrate to them the evil effects
of tight lacing. In any case, the use of
X-rays in the field of medicine was
recognized, and for this discovery Roentgen
was awarded the first Nobel Prize in Physics in 1901. Roentgen
donated the entire monetary reward, a part of the Nobel Prize, to the
university where he worked. Unlike most present day research workers,
Roentgen refused to patent his discovery, so that it might benefit
mankind.
The reports of the discovery of X-rays in the popular press produced
amusing results. X-rays were depicted as a kind of rays which could
reach where human eyes could not. It came to be believed that under
X-rays, a person would be seen as if disrobed. Leading articles were
written in the newspapers condemning the “revolting indecency of an
invention which would make privacy impossible.” In a London paper,
an advertisement appeared for the sale of “X-ray-proof undergarments”.
In the USA, a bill was introduced in the assembly of New Jersey to
prohibit the use of “X-ray opera glasses” in the theaters. Since the
dangers of exposure to radiations
were not known, and the primitive
apparatus could be easily assembled,
“X-ray studios” opened up for
amusing people by taking X-ray
pictures of their hands or feet
(Fig. 5.6).

Fig. 5.6: X-ray studio

The next advancement in
medical imaging was the invention
of computed tomography (CT) in
1972, which was originally called
computed axial tomography (CAT).
Besides bones, even soft tissues and
blood vessels can be delineated with
a CT scan. A CT scan consists of
taking a large series of two-dimensional X-ray images around a single
axis of rotation. The computer can give a 3-D representation of the
tissue. For this invention, Godfrey Newbold Hounsfield, a British
engineer, and Allan McLeod Cormack, a South African physicist, shared
the Nobel Prize in Medicine, 1979.
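The reconstruction principle behind CT can be illustrated with a toy back-projection sketch. Everything below (the 64 x 64 square "phantom", the helper names, the 36 projection angles) is invented purely for illustration; real scanners use filtered back-projection with far more projections, which this simple, unfiltered version omits.

```python
import numpy as np

def rotate(img, angle):
    """Rotate a square image about its center (nearest-neighbor)."""
    n = img.shape[0]
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    ca, sa = np.cos(angle), np.sin(angle)
    # inverse-map each output pixel back to its source coordinate
    xsrc = ca * (xs - c) + sa * (ys - c) + c
    ysrc = -sa * (xs - c) + ca * (ys - c) + c
    xi = np.clip(np.rint(xsrc).astype(int), 0, n - 1)
    yi = np.clip(np.rint(ysrc).astype(int), 0, n - 1)
    return img[yi, xi]

n = 64
phantom = np.zeros((n, n))
phantom[24:40, 24:40] = 1.0          # a bright square stands in for tissue

angles = np.linspace(0.0, np.pi, 36, endpoint=False)
# forward step: each projection is a column-sum of the rotated phantom
sinogram = [rotate(phantom, a).sum(axis=0) for a in angles]

# back-projection: smear each projection across the image,
# rotate it back, and average over all angles
recon = np.zeros((n, n))
for a, proj in zip(angles, sinogram):
    recon += rotate(np.tile(proj, (n, 1)), -a)
recon /= len(angles)

print(recon[32, 32] > recon[4, 4])   # phantom region reconstructs brighter
```

The unfiltered result is blurry; a key part of Cormack's and Hounsfield's contribution was the mathematics and engineering that turn such projections into a sharp image.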
A still better medical imaging method was the invention of Magnetic
Resonance Imaging (MRI) in the 1970s. Essentially, MRI turns hydrogen
atoms in the tissues into tiny radiotransmitters. Since ionizing radiations
are not used, MRI is far safer than CT scan. Moreover, it gives better
contrast between similar but not identical tissues. For this invention,
Paul Lauterbur, an American, and Peter Mansfield, a Briton, shared
the Nobel Prize for Medicine, 2003.
THE PIONEER NEUROHISTOLOGISTS


Camillo Golgi (1843-1926) (Fig. 5.7)
was a brilliant Italian histologist. He is
remembered for the development of a
special stain, now known as the Golgi
stain, using which, he made remarkable
discoveries into the histological
structure of various parts of the brain.
He and Santiago Cajal, another
contemporary neurohistologist, were
awarded Nobel Prize in Medicine,
1906. Both can be considered the
founders of modern neuroscience.
Fig. 5.7: Camillo Golgi

Histological techniques and stains
such as hematoxylin and carmine were
available by the middle of the 19th century. These stains were not
satisfactory for histological investigations of the nervous system.
Consequently, the nervous system remained a totally unexplored territory
in histology. In the beginning of his medical career, Golgi was appointed
Chief Medical Officer in a small Hospital of Chronically Ill, near Milan.
In the seclusion of this hospital, he converted a little kitchen into a
rudimentary laboratory and started a search for a new staining technique
that could be used on sections of nervous tissue. Within one year, in 1872, he
published a small note, “On the Structure of the Brain Grey Matter”,
wherein he described special technique of hardening the brain with
potassium dichromate followed by impregnation with silver nitrate.
The result was a stark black deposit on the entire neuron. For the first
time, the neuron cell body with all its dendrites and the axon could be
seen. This stain is still being used and called the Golgi stain or Golgi
impregnation technique. In 1875, Golgi was able to describe neural
structure of the olfactory bulb, and he was appointed as a professor of
histology in the University of Pavia, Italy. In 1885, he published a beautifully
illustrated monograph on the histological structure of the central nervous
system. For the first time, Golgi was able to describe the morphological
structure of the glial cells and the relationship between the glial cell
processes and the blood vessels. He also described two types of neurons.
Golgi type I neurons were those with long axons (the projection neurons
in modern terminology). Golgi type II cells were those with axons
ramifying in the vicinity of the cell body (now called the interneurons).
Golgi also described the sensory receptors present in the tendons of the
muscles, now called the Golgi tendon organs.
In 1886-1892, Golgi made some fundamental contributions to the
knowledge of malaria. He was able to demonstrate the two types of
malarial parasite, causing the tertian and the quartan types of fever. He
also demonstrated that the bout of fever coincided with the release of
the malarial parasites into the blood.
Another of Golgi’s significant contributions to histology was the
discovery of a cytoplasmic organelle, which he called the “internal
reticular apparatus”. The existence of
this structure was doubted by other
workers, who considered it a staining
artifact. About five decades later
(mid-1950s), it was confirmed by
electron microscopy, and named the
Golgi apparatus.
Santiago Ramon Y Cajal
(1852-1934) (Fig. 5.8), a Spaniard, was
another brilliant neurohistologist of
that era. He was the son of a professor
of applied anatomy.

Fig. 5.8: Santiago Ramon Y Cajal

In his childhood, he was wild and stubborn. Because
of his poor behavior, he had to change many schools. He even became an


apprentice to a shoe-maker and later a barber. Ultimately, better sense
prevailed and he agreed to join a medical school.
Cajal used Golgi’s method of staining and described in detail almost
every aspect of the nervous system. These descriptions in his book
“Histology,” published in 1909, were so accurate that the book is still
used as an authoritative reference book in all the laboratories of
neuroscience. In contrast to the Reticular Theory (all the neurons form
a syncytial network) accepted by his contemporaries including Golgi,
Cajal presented the Neuron Doctrine, stating that the neuron is an
anatomic, physiologic and metabolic unit of nervous system. The
problem could be resolved much later and the idea of Cajal was found
correct. He devised a new staining technique of reduced silver nitrate
method and studied in detail the processes of degeneration and
regeneration in the peripheral nervous system. The results were published
in another classical book, “Degeneration and Regeneration,” in 1934.
Microphotography was not well-developed in those days. A drawing
of the observed image was the only way of illustration. Some of the
drawings of Cajal were so intricate that they were regarded as artistic
interpretations rather than accurate copies. Now, when better methods
are available, reexamination of Cajal’s illustrations has shown the accuracy
of his descriptions. Cajal used the same histological methods as his
contemporaries, but he could observe what others could not. This was
the genius of Cajal.

MARIE CURIE
Marie Curie (1867–1934) (Fig. 5.9) was a Polish chemist who may be
considered the mother of radiotherapy as well as the atomic bomb. She
is one of the few persons who have received the Nobel Prize twice, and was
the first woman Nobel laureate. She
founded the Curie Institutes in Paris and
Warsaw, the pioneer institutes engaged in
radiotherapy.
Curie was born in Warsaw, Poland,
then a part of Russian Empire and named
Maria Sklodowska. Due to her gender and
Anti-Polish policies of authorities, she
was denied admission to any university in
Poland. Since the family was not well off,
she worked as governess for several years
and financially supported the medical
education of her elder sister, Bronia, in
Paris. Once Bronia established herself as a
physician, Maria also shifted to Paris for
higher education in chemistry and physics.

Fig. 5.9: Marie Curie
In Paris she met and married her tutor, Pierre Curie. To facilitate
her acceptance in the adopted country, she changed her name to Marie
Curie. The husband-wife team started work on radioactive materials and
discovered the highly radioactive elements polonium and radium, and
determined their atomic weights and properties. For this work, they shared the Nobel Prize in
Physics, 1903, with another pioneer in radioactivity, Becquerel. The
term radioactive was coined by Marie.
The laboratory where the Nobel laureates worked was more of a
shed where, in Winter, the night temperature fell to around six degrees.
As one chemist commented “it looked more like a stable or a potato
cellar.” Ignorant of the dangers of radioactivity they were exposed to,
Marie admitted: “One of the pleasures was to enter our workshop at
night; then around us, we would see the luminous silhouettes of beakers
and capsules that contained our products.” Despite their financial
difficulties, the Curies refused to file a patent application for their
discovery. For them, the priority was to enable any scientist, French or
a foreigner, to find medical applications of the discovery.
Pierre Curie tested radium on his skin. It caused a burn and then a
wound. Its tissue destructive property was thus proved. That experience
started the use of radium in the treatment of malignant tumors:
radiotherapy was born. In 1906, Pierre Curie, weakened by exposure to
radiation, was run over by a horse-drawn carriage. Widowed, Marie
Curie continued her work on radioactive elements and received another
Nobel Prize in 1911, this time in chemistry.
During World War I, Madam Curie, as she came to be known,
played an active role in providing mobile X-ray machines to the troops.
Thus, in the wounded, the bullets and the shrapnel could be localized
and treated at the earliest.
Due to prolonged exposure to radiations Madam Curie’s health
began to decline in late 1920s. Her vision became very poor because of
cataract. Ultimately, she died of aplastic anemia in 1934. Her daughter,
Irene Joliot-Curie, also became an active research worker in radioactivity.
She won a Nobel Prize for chemistry in 1935. She died of leukemia in
1956, presumably because of prolonged exposure to ionizing radiations.

Quotations of Marie Curie


• All my life through, the new sights of nature made me rejoice like a
child.
• One never notices what has been done; one can only see what
remains to be done.
• You cannot hope to build a better world without improving the
individuals. To that end, each one of us must work for his own
improvement.
• I was taught that the way of progress is neither swift nor easy.
• Nothing in life is to be feared. It is only to be understood.
• Be less curious about people and more about ideas.
INVENTION OF SPHYGMOMANOMETER
Hypertension must be a disorder as old as mankind. At present about
one billion people are believed to suffer from hypertension. However, it
was recognized as a disease only in the early decades of the 20th century
when the sphygmomanometer came to be used routinely in clinical
medicine. Till the middle of the 20th century, there was no effective remedy
against hypertension. Roosevelt, President of the USA as well as Joseph
Stalin, dictator of the Soviet Union, both died of the complications of
untreated hypertension.
The existence of blood pressure must have been apparent by the time of
Hippocrates from the way blood spurted out of slashed arteries, but no
one found a way to measure it. The blood pressure was measured for the
first time in 1733 when Stephen Hales, a clergyman interested in
biology, inserted a brass tube into the artery of an unanesthetized horse
and measured the height to which the column of blood rose in the glass
tube connected to the brass tube. He recorded blood pressure of the
horse as eight feet and three inches (See Fig. 23.2). This invasive method
obviously could not be used to record the human blood pressure. About
100 years later, in 1847, Carl Ludwig inserted a cannula into the artery
of a man and recorded the pressure on a rotating drum called Kymograph.
A noninvasive method to measure the human blood pressure was still
not available.
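As an aside, Hales's reading of eight feet and three inches of blood can be converted to modern units with a quick calculation. The densities used below are rounded textbook values (blood about 1.05 g/mL, mercury 13.6 g/mL), so the result is only approximate:

```python
# Convert Hales's blood-column height to the equivalent mercury column.
height_blood_cm = (8 * 12 + 3) * 2.54        # 8 ft 3 in expressed in cm
height_hg_cm = height_blood_cm * 1.05 / 13.6  # same pressure in cm of mercury
pressure_mmhg = height_hg_cm * 10
print(round(pressure_mmhg))                   # about 194 mmHg
```

A mean arterial pressure near 190 mmHg is plausible for a large, struggling, unanesthetized horse, which is consistent with Hales's account.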
In 1881, an Austrian physician, Samuel von Basch, developed a
mercury manometer attached to a water-filled bag. The bag was pressed
against the brachial artery and the pressure at which the pulse disappeared
was noted as the (systolic) blood pressure. Direct measurement of
blood pressure by arterial catheterization confirmed the accuracy of this
noninvasive method. Even then, it was not put to any use because it was
considered too cumbersome. Moreover the physicians did not see any
need for such a measurement. Further refinement in the
sphygmomanometer was given by an Italian physician, Riva-Rocci, in
1896. He created an inflatable rubber cuff connected to a mercury
manometer. The cuff was tied on the upper arm and inflated till the
radial pulse disappeared. This method, though simple, could record
systolic blood pressure only. Moreover, since the cuff was only five cm
wide, the results were not accurate enough. Subsequently, von
Recklinghausen recognized the problem in 1903 and widened the cuff
from 5 to 13 cm, though the modified instrument continued to be known
as Riva-Rocci sphygmomanometer (Fig. 5.10). Even then the physicians
of Europe did not find any merit in the use of blood pressure
measurement. A fundamental role in the spread of the use of the
instrument was played by the American surgeon, Harvey Cushing. In
1901, on a visit to Italy, he came across the Riva-Rocci instrument. He
was the first to grasp the significance of the invention. He started the
routine use of the instrument during anesthesia, and found it a valuable
tool to reduce the mortality in surgical patients.

Fig. 5.10: Riva-Rocci sphygmomanometer


Further refinement in the record of human blood pressure was given
by a Russian surgeon, Nikolai Korotkoff. He was the first to observe
the sounds over the brachial artery at certain points during deflation of
the cuff of sphygmomanometer. By this method, both the systolic and
diastolic blood pressures could be accurately measured. The
observation was reported in 1905 in a brief paper, 207 words long, in the
Russian language. The world came to know about it only in 1941, when it was
translated into English. By this time the diagnosis of hypertension
began to be made but its implications were still not fully appreciated.
During 1950s the American insurance companies recognized that people
with higher blood pressure tend to die at a younger age than individuals
with lower blood pressure. The first epidemiological studies of
hypertension, conducted on behalf of the insurance companies, revealed
significant correlation between high blood pressure and mortality from
disorders of the heart, brain and kidneys. In late 1940s, search had begun
for a safe and effective antihypertensive drug but none was in sight.
Sympatholytic drugs available at that time could reduce the blood
pressure, but because of their side effects, very few patients would
comply with the prescribed medication. At this juncture, Dr Rustom Jal
Vakil, a cardiologist practicing in Bombay, came up with an Ayurvedic
antihypertensive, Rauwolfia serpentina.

HARVEY WILLIAMS CUSHING


Harvey Williams Cushing (1869-1939) (Fig. 5.11) was an American
neurosurgeon and a pioneer in brain surgery. He is considered by many
the greatest neurosurgeon of the 20th century. His name, known to
every medical student, is immortalized in medical books by a number of
diseases named after him, e.g. Cushing syndrome, Cushing disease,
Cushing reflex, and Cushing ulcer. Cushing made significant contribution
206 | History of Medicine

to the subject of pathology with his


work on the tumors of the pituitary
gland and brain.
When he entered the Harvard
Medical School, in 1891, he was
directed by his father to abstain from
smoking, drinking and other forms of
intemperance like boating and baseball!
How far Cushing followed his father’s
instructions is not known. During his
days as a medical student, ether
anesthesia was in vogue and usually
the medical students were called upon to administer it in the operation
theater. One patient anesthetized by Cushing died during the operation.

Fig. 5.11: Harvey Williams Cushing

This accident disturbed Cushing to such an
extent that he thought of quitting the medical school. Discussing with
one of his class fellows, Ernest Amory Codman, he came to the
conclusion that the tragedy might have been prevented if the vital signs
of the patient had been regularly monitored during the operation.
Thereupon, the two medical students developed a chart for recording
the pulse and respiration of all patients under anesthesia. This “ether
chart,” as it came to be known, worked well and revolutionized
anesthesia practice. This innovation was to considerably reduce
intraoperative mortality. Even as students, Cushing and Codman also
realized the clinical importance of X-rays. X-rays were discovered in
1895, and in 1897, they reported two cases of gun-shot wounds in the
spine, in which the bullets were localized through the use of X-rays.
In 1896, Cushing joined the Johns Hopkins Hospital in Baltimore as
an assistant resident in surgery. His supervisor was the legendary William
Halsted, the father of American surgery, known for his meticulous surgical
techniques. Cushing spent more than one year in Europe, visiting well-
established surgeons like Theodor Kocher, as well as research workers
like Charles Sherrington. In Italy, he was introduced to the Riva-Rocci
instrument for measurement of blood pressure. On his return, he
introduced the instrument in the USA. As a result, the regular measurement
of blood pressure was added to the ether chart.
One of the major difficulties in brain surgery is the problem of
hemorrhage, since the scalp as well as the brain and brain tumors are
highly vascular. Before Cushing, brain surgery carried almost 100%
mortality. Harvey Cushing’s most notable contribution to the
development of neurosurgery was the prevention of blood loss using a
special clip (Cushing clip) and the electrocoagulation of the bleeding
vessels. In 1909, Cushing carried out the first operation for the treatment
of acromegaly. Between 1909 and 1911, Cushing operated on 46 cases
with pituitary disorders. In all, Cushing operated on more than 2000
verified cases of brain tumors. He brought the mortality in neurosurgical
operations below 10%.
The 1920s were a particularly fruitful period for Cushing. His
clinical output was prodigious, and he trained a series of remarkable
neurosurgeons, both from the USA and Europe, who, through their
trainees, spread Cushing’s neurosurgical techniques throughout the world.
Due to the zeal of Harvey Cushing, neurosurgery became a separate
discipline of surgery.

Quotations of Harvey Williams Cushing


• The capacity of a man is revealed when, under stress and
responsibility, he breaks through his educational shell, and he may
then be a splendid surprise to himself, no less than to his teachers.
• Standardization of our educational systems is apt to stamp out
individualism and defeat the very purpose of education by leveling
the product down rather than up.
• I would like to see the day when somebody would be appointed
surgeon who has no hands, for the operative part is the least part of
his work.
• A physician is obligated to consider more than a diseased organ,
more even than the whole man—he must view the man in his world.
• In these days when science is clearly in the saddle and when our
knowledge of disease is advancing at a breathless pace, we are apt to
forget that not all can ride. He also serves who waits and applies
what the horseman discovers.

IVAN PETROVICH PAVLOV


Ivan Petrovich Pavlov (1849–1936)
(Fig. 5.12) was a Russian physio-
logist, psychologist and a physician,
who is best remembered for his
studies on the physiology of diges-
tion and the concept of uncondi-
tioned and conditioned reflex
actions. For this work he received
Nobel Prize in Medicine in 1904.
Pavlov was the son of a poor village priest. He started his higher
education to be a priest but abandoned it in favor of natural sciences.

Fig. 5.12: Ivan Petrovich Pavlov

Even as a student, in collaboration with a class fellow, he published a
paper on the physiology of pancreatic nerves. On graduation, he was
awarded a gold medal. During this period he became so much interested
in physiology that he took admission in a medical academy and passed
the medical examination in 1879, again with a gold medal. In 1883, he
presented his doctor’s thesis on the reflex control of the heart, entitled
“Centrifugal Innervation of the Heart.”

After 1890, for the rest of his life, Pavlov’s chief interest was
gastrointestinal physiology. He developed a special type of gastric fistula,
named after him as Pavlov’s pouch. With such a pouch, the physiology
of gastric digestion could be experimentally studied, on a long-term basis,
without disturbing the normal digestive processes. This procedure was
a breakthrough because earlier only acute vivisection was used for
experimental studies on digestion. By this method, Pavlov demonstrated
a cephalic (neural) and a gastric (chemical) phase of gastric secretion.
For this work he was awarded the Nobel Prize in 1904.
Pavlov is also known to the general public as a research worker in
experimental psychology and psychopathology. Anecdotal evidence
suggests that he became interested in these studies when he observed a
dog drooling whenever the laboratory assistant approached it. Soon he
realized that the dog was reacting to the white coat: the laboratory
assistant who served it food always wore a white lab coat. Pavlov concluded
that, in the animal’s brain, the food and the white coat had become associated.
Therefore, the dog reacted to the white coat as if food were on its way.
Such conditioned responses could be demonstrated using a variety of
sensory stimuli like sound of a bell, whistle, metronome, tuning fork and
a range of visual stimuli. All these articles can be seen even today,
because the Russian Government has carefully preserved the lab where
Pavlov worked for major part of his life. Pavlov thus established what
he called conditioned reflexes (now named conditioned responses).
Pavlov was an excellent operator who was compulsive about his
working hours and habits. He would sit down for lunch at exactly 12
o’clock and go to bed at exactly the same time every evening. The dogs
were served food at exactly the same time every day. He would go on
vacation on the same date every year! Politically, he was opposed to the
Russian Revolution of 1917, and wanted to leave Russia. However, in
view of his international name and fame, he was persuaded to stay. A
decree personally signed by Lenin assured him of full support in his
scientific work.

Quotes of Ivan Petrovich Pavlov


• Science demands from a man all his life. If you had two lives, that
would not be enough for you.
• While you are experimenting, do not remain content with the surface
of things. Don’t become a recorder of facts, but try to penetrate the
mystery of their origin.
• Perfect as the wings of a bird may be, it can never enable the bird to
fly if unsupported by the air. Facts are the air of science. Without
them, a man of science can never rise.

HISTORY OF BLOOD TRANSFUSION


The first anecdotal account of blood transfusion is as follows: In 1492,
Pope Innocent VIII was seriously ill and his physician advised blood
transfusion as a treatment. Blood was removed from three young boys
and infused into a vein of the patient. The boys were paid one ducat
each. The Pope showed no improvement in his health but all the three
boys died of blood loss.
In the year 1665, a British physician, Richard Lower, tried transfusion
of blood from one dog to another. Most of the dogs survived. Next, a
French physician by the name of Dr Jean-Baptiste Denys transfused a
lamb’s blood into a middle-aged man suffering from mad rages. He hoped
that the blood transfusion would make the patient as placid as the lamb!
As can be expected, the patient died of severe hemolytic reaction and
the physician was charged with murder. Soon after that, the practice of
transfusing animal blood into humans was declared illegal. For the next
150 years blood transfusion was not attempted by anyone. In 1818, Dr
James Blundell, a British obstetrician performed the first successful
human-to-human blood transfusion. At first, he removed four ounces of
blood from a man and transfused it to his wife who was suffering from
severe blood loss. The patient survived. Encouraged, Dr Blundell
performed blood transfusions in another ten of his patients. Only five
survived. He invented many instruments for blood transfusion and made
as much as 50 million US dollars from this endeavor. It became clear that
about half the number of patients receiving blood transfusion suffered
adverse reactions, sometimes fatal. The breakthrough came in 1901
when Karl Landsteiner, an Austrian scientist discovered the blood
groups A, B and O. In 1902, two students working with Landsteiner
discovered blood group AB. Until 1914, the discovery of blood groups
was given little attention. Blood transfusion could not become popular
because the blood removed from the donor used to clot before it could be
transfused into the patient. The difficulty was solved in 1914 when citrate-
glucose was introduced as an anticoagulant by Richard Lewisohn (1875–
1961), a surgeon in the United States. During World War I, transfusion
of compatible blood was first performed on a large scale and saved many
lives. The value of blood transfusion technology was further enhanced
when the practice of storing blood in the refrigerator was initiated. The
term “Blood Bank” was used in 1937 for the first time. Karl Landsteiner
was given the Nobel Prize in medicine in 1930 for the discovery of
blood groups.
Although the practice of intragroup blood transfusion was being
followed, transfusion reactions could not be eliminated. It was believed
that the adverse reactions were caused by other blood group systems
not yet discovered. This possibility was proposed by Philip Levine, an
American physician, in 1939. A woman, who had delivered a dead fetus
and suffered severe postpartum hemorrhage, was given her husband’s
blood. Even though both had blood group O, the lady suffered an
incompatibility reaction. The cause of the mismatch, the Rh blood group,
was discovered by Karl Landsteiner and Alexander Wiener in 1940. (Landsteiner
became an American citizen in 1929).

20TH CENTURY BUSINESSMEN-SURGEONS


By the year 1900, aseptic surgery and anesthesia were firmly established.
These developments made surgery a roaring business proposition. All
over Europe and the USA, surgery was performed in abundance.
Appendicectomies, cholecystectomies, prostate surgery, small gut
operations and even thyroid surgery became all too common. Thousands
of appendixes were removed in the 1920s and 1930s after labeling cases of
abdominal pain as chronic appendicitis. Operations were devised to “fix”
the abdominal viscera found “displaced” by radiological examination.
Hundreds of thousands of tonsillectomies were performed, mostly
unnecessarily. Hysterectomy became a “fashionable” treatment. While
taking the past history of a patient, one of the questions was: “Have
you got your uterus, appendix, or tonsils removed?”
Constipation has been considered a serious disorder since time
immemorial. In the late 19th century, physicians developed the theory
of self-poisoning due to constipation. “A constipated person, by retaining
his wastes is always working towards his own destruction; he commits
suicide by intoxication,” declared the well-known physician Bouchard.
Disorders ranging from indigestion, nervousness, insomnia to impotence
were attributed to constipation. Various methods to relieve constipation
ranged from enema, colonic irrigation, rectal dilators and electrical
stimulators to surgical removal of yards of small and large gut. The last
treatment obviously was favored by (businessmen) surgeons. According
to some of such surgeons, complaints such as periodic nasal congestion
in the females were the result of a “reflex” response to the presence of a
“pelvic trouble” like malposition of the uterus or ovaries. Gynecologists
made themselves rich by performing pelvic surgeries to correct the
“malformations.”
Thus, by the 1920s the businessmen-surgeons became extremely rich
as well as powerful. The image of a surgeon in society in the 20th
century was very different from that of the barber-surgeon, who had been
far below the status of a physician on the social scale.

THE GOLDEN ERA OF SURGERY


By the middle of the 20th century, aseptic surgical techniques, anesthesia,
blood transfusion and antibiotics had been discovered. These discoveries
led to the onset of the golden era of surgery. By this time the surgery on
the gastrointestinal tract, genitourinary tract, breast, thyroid, etc. had
become very safe and almost routine surgical procedures. Still newer
surgical techniques led to the establishment of new surgical specialties
like cardiac surgery, thoracic surgery, pediatric surgery, neurosurgery,
cosmetic (plastic) surgery, joint replacement surgery and finally the
key-hole surgery.
The first operation for mitral stenosis was performed in 1925, but
operations for aortic stenosis started only in the late 1940s. In 1944, Helen
Brooke Taussig, an American pediatrician, and Alfred Blalock, a cardiac
surgeon, performed the first surgical operation for a blue baby. However,
cardiac surgery was still not safe. The operation often produced
severe brain damage due to lack of blood supply to the brain during
surgery. Subsequently, by 1952 open heart surgery became safe with
the use of heart-lung machines and hypothermia. Soon, artificial valves
also began to be implanted in the heart.
Organ transplant surgery is the latest advance in surgical techniques.
With the discovery of immune responses and the mechanism of
graft-rejection as well as of immuno-suppressant drugs, organ
transplants became feasible. Kidney transplantation was the first success
story. A patient with a transplanted kidney can have many years of active,
fruitful life. Similarly, corneal transplants have restored vision in thousands
of blind patients. Bone marrow transplants have saved hundreds of lives.
The first heart transplantation was a far more dramatic event.
On the third of December, 1967, a South African surgeon, Dr Christiaan
Barnard, performed the world’s first heart transplant operation and
became a world-famous heart surgeon. The patient lived only 18 days
after the surgery. However, his second heart transplant patient lived 593
days. Within weeks of Barnard’s achievement, almost every heart surgeon
jumped on the bandwagon more for the sake of self-glorification than
patient’s benefit. For example, a patient who underwent heart transplant
at the cost of 29,000 US dollars lived only 16 days, mostly in the
intensive care unit of the hospital. The initial hullabaloo was followed
by a slow realization that all the expenditure, though it benefited the
surgeon, was of little use to the patient. Dr Helen Taussig, the American
pediatric cardiologist, was one of the most vocal critics of cardiac
transplant surgery. According to her, “any new drug would be promptly
withdrawn from the market if it results in 50 percent mortality. The
mortality in heart transplant surgery is still higher and the postoperative
life of most of the survivors is very limited. So there seems to be no
justification for such operations.” Similar argument holds true for the
lung and liver transplants, though such operations help the surgeons to
make headlines in the international press.

RONALD ROSS
Sir Ronald Ross (1857–1932) (Fig. 5.13) was a British army physician
who discovered the mode of spread of malarial parasite by the Anopheles
mosquito. For this work he was awarded the Nobel Prize in medicine in
1902. The significance of this discovery will be better appreciated
after a perusal of the history of malaria given in the next section.
Ross was born in Almora, India, where his father, General Sir CC
Ross of the British army was posted. Ronald Ross graduated in medicine
from a medical school in London. His
studies on malaria started in 1881 when
he was posted in Presidency General
Hospital, Calcutta. Ultimately, in 1897,
in Secunderabad, Ross was able to
demonstrate the presence of malarial
parasite within the Anopheles mosquito.
Using birds that were sick with malaria,
he was soon able to ascertain the entire
life cycle of the malarial parasite including
its presence in the mosquito’s salivary
glands. For this remarkable work Ross
was awarded the Nobel Prize in
Medicine.

Fig. 5.13: Ronald Ross

Ronald Ross showed a life-long devotion to the task of eradication
of malaria in different countries. He carried out surveys and initiated
schemes for the prevention of malaria in India, Ceylon, West Africa, the
Suez Canal zone, Greece, Mauritius, Cyprus and in many areas affected
by World War I. Ross also made another significant contribution to
medicine by developing a mathematical model for the study of the
epidemiology of malaria.

FROM MAL’ARIA TO MALARIA


Intermittent fever, the hallmark of malaria has been recorded since ancient
times. Reference to such a fever can be found in the Vedic writings of
1600 BC. Charak and Sushruta gave vivid descriptions of the fever and
even mentioned its association with insect bites. Hippocrates (c. 400 BC)
also described the intermittent fever and its deadly effect. The term
mal’aria (meaning bad air) was used for the first time in the 18th century
when the association of such fevers with “poisonous vapors” of the
swamps was noted. The other names for such fevers were jungle fever,
marsh fever, and paludal fever. The term malaria, without the
apostrophe, came to be used only in the 20th century.
The history of malaria can be linked to the history of mankind.
Greatest of the great warriors have fallen victim to the disease. Even the
downfall of the great Roman Empire in the 4th–5th century AD has been
attributed to malaria. Alexander the Great, the conqueror of the entire
world, fell victim to the bite of a mosquito at the age of 33. In many wars,
armies were defeated by the mosquito bites rather than the arrows or
guns of the enemy. During the American Civil War (1861–1865), malaria
accounted for 10,000 deaths in the army. The French campaign in
Madagascar in 1895 saw 13 deaths in action and 4000 deaths due to
malaria. In World War I, about 80 percent of the French troops were
hospitalized with malaria, and 25,000 British troops were sent home
from the war zone since they were suffering from chronic malaria. In
malaria-prone regions of the world, malaria accounted for more military
deaths than the bullet in both World War I and World War II. Even today,
350–500 million people suffer from malaria every year, mainly in Africa
south of the Sahara, but it is prevalent in many south-east Asian countries
also. Malaria accounts for one million deaths per year, 90 percent of
which occur in Africa.
The malarial parasite was first discovered in the blood of a patient
by a French army surgeon, Charles Laveran, in 1880, when posted in
Algeria. For this discovery, Laveran was awarded the Nobel Prize in
medicine in 1907.
The treatment of malaria was known much before the cause of
malaria was discovered. Even before the 1600s, the local population of
Peru (South America) used an infusion of the bark of a tree, called the
fever tree, in the treatment of certain types of fevers. In 1638, the Countess of Chinchon,
wife of the Spanish Viceroy of Peru, fell desperately ill with an intermittent
fever (malaria). She was saved by infusion of bark of the “fever tree.”
The incident made the tree famous among the Europeans residing in
Peru. In the 1640s, Catholic priests (Jesuits) brought the bark from Peru
and used it to treat fevers in Spain. They named it Cinchona bark. The
efficacy of the Cinchona bark was proved beyond doubt when it cured
both King Charles II of England and a son of King Louis XIV of France,
who suffered from a malarial fever. After the cure, the cinchona bark
became extremely popular and so expensive that only the very rich
could afford it.
Once the cure for the deadly disease, malaria, was discovered, the
demand of cinchona bark soared. Over 25,000 cinchona trees were cut
every year and by 1795, the tree became almost extinct. Consequently,
Peruvian officials stopped the export of the tree. In 1820, two French
scientists Caventou and Pelletier isolated the active ingredient of
cinchona bark and named it quinine. The importance of this discovery
can be judged from the fact that a monument has been erected in their
honor in Paris. In the beginning of 20th century, quinine was the only
drug known for a specific disease. Almost all other drugs used in the
previous 2,000 years were ultimately found to be useless and discarded
by the middle of the 20th century. To cope with the rising demand for
quinine and a decline in export from Peru, the Dutch government was
instrumental in the smuggling of a pound of cinchona seeds in 1865, which
were planted in Java (now a part of Indonesia). Thus the Dutch
monopolized the trade in quinine for almost the next 100 years.
During World War I, supplies of quinine from Java were disrupted and
much of the world was deprived of quinine. Hence a search for an
alternative drug for the treatment of malaria was launched just after the
end of World War I. In 1934, the
Germans developed a synthetic antimalarial and named it Resochin but
it was not very popular. World War II was again a time of great demand
for quinine. Over 60,000 US troops died of malaria. The reason was the
nonavailability of quinine since Java, the chief source of quinine at that
time, was again occupied by Japan in 1942. At the same time, the Germans
took over Amsterdam, where the world’s reserve stocks of quinine were
stored. Thus
began the race for an alternative antimalarial drug. By 1944, the factory
manufacturing Resochin fell into the hands of the Allied troops. The
Americans slightly modified the structure of Resochin and sold it under
the name of chloroquine. Subsequently, many other antimalarials came
in the market.
The discovery of DDT as an insecticide in 1939 by Paul Müller
had an unintended benefit during World War II. During World
War I, thousands of soldiers had died because of lice-borne infections
like trench fever. The disease was rampant among the soldiers because
among those engaged in trench warfare, any form of personal hygiene
was impossible. At the beginning of World War II, it was feared that
history would repeat itself. In the British troops, it was made
compulsory to wear DDT-impregnated underwear. The underwear
remained free of lice for many weeks. Thus thousands of Allied troops
were saved from the jaws of death by DDT. The “secret weapon” was
not leaked to the Germans, whose troops continued to die of the trench
fever. Paul Müller was awarded the Nobel Prize in medicine in 1948. By
this time the benefit of DDT spraying in the control of malaria was realized.
All over the world, DDT spraying was adopted as a means of malaria
eradication. In the western world, malaria could be controlled.
However in most of the developing countries, a sharp decline in the
incidence of malaria was followed by the development of DDT-resistant
mosquitoes. The incidence of malaria has reached almost the same rate
as in the pre-DDT era. The WHO has also abandoned the malaria eradication
program.

INFLUENZA PANDEMIC 1918


An influenza pandemic swept across the world in 1918. It affected
people in every continent except the Antarctic. It lasted only about one
year, but the infection rate was very high. About 50% of the people
were infected, and it killed between 50 and 100 million people worldwide.
The death rate of about 2.5–5% during this epidemic was in contrast to
only 0.1% mortality during other influenza epidemics. Another peculiar
feature of the 1918 influenza epidemic (Fig. 5.14) was that it killed
mostly the young adults. In the previous and subsequent influenza
epidemics, mortality was mostly restricted to the very young (below 2
years age) or very old age (above 70). It was known as the Spanish flu
since the cases were first reported in Spain. Eight million people were
affected in Spain alone. Most of the deaths occurred in Asia and the
highest percentage of population was killed in India. About 5% of the
Indian population died due to the epidemic. Among the Indian troops
about 22% of those who caught the disease died.

Fig. 5.14: Influenza epidemic 1918



The disease started with cough, headache, backache, fatigue, high fever
and difficulty in breathing. Within three days of the onset of the symptoms,
the patients died of acute pulmonary edema. In some cases, death
occurred within 24 hours. One of the most striking complications was
hemorrhage from the mucous membranes, especially of the nose, stomach
and intestines, but the majority died from acute pulmonary edema. Those
were the last few months of the World War I, when 19 nations were at
war. The disruption, stress and privation of war aided the transmission
of flu. About 57,000 American soldiers died of influenza, whereas 53,000
American soldiers died in battle.
The tragedy was that doctors could do nothing. In spite of so many
advances in medicine, the actual therapeutic situation was no better than
in the 14th-century plague epidemic. Doctors tried every possible
measure they could think of: quinine tablets, bleeding, castor oil, digitalis,
morphine, enema, aspirin, tobacco, hot baths, cold baths, iron tonics,
and expectorants of pine tar! But nothing worked.

AUGUST KROGH
Schack August Steenberg Krogh (1874-1949) (Fig. 5.15) was a Danish
professor of zoophysiology, who was awarded the Nobel Prize in
Medicine in 1920 for his work on the regulation of blood flow to
skeletal muscle. He is one of the few non-medical scientists who have
been awarded the Nobel Prize in medicine.
His interest in zoophysiology was aroused after attending a lecture
by a prominent human physiologist, Christian Bohr. Krogh joined the
department headed by Bohr in 1897, as an assistant. Bohr soon became
aware that Krogh had a natural aptitude for laboratory work. Krogh set
up his experiments in a simple way and himself constructed the necessary
equipment with extraordinary skill. So
Bohr gave him a free hand in his laboratory.
Together with Bohr, Krogh published a
paper in 1904, demonstrating that carbon
dioxide reduces the capacity of hemoglobin
to hold oxygen. Thus they could explain
the mechanism of the release of oxygen in
the tissues. The phenomenon became
known as the Bohr Effect.
The collaboration between Bohr and
Krogh thus began well, but it ended in
disaster. Earlier, Bohr had done experiments
showing that the oxygen pressure of arterial blood was greater than the
oxygen pressure in the alveolar air in the lungs. He therefore declared
that oxygen is transported from the alveolar air into blood by an active
transport mechanism (secretion).

Fig. 5.15: Schack August Steenberg Krogh

Krogh did not believe this. He devised a microtonometer and
demonstrated that the oxygen pressures in the alveolar air and arterial
blood were similar. It showed that the oxygen transport in the lungs was
by simple diffusion. In April 1907, Krogh demonstrated his experiment
to his mentor. Bohr could not tolerate that his own student had challenged
and disproved his findings. From that day onward, Bohr did not talk to
Krogh.
Subsequently, Krogh quit Bohr’s laboratory, and started independent
research. Since the results on the gas transport in the lungs were against
the theory accepted by Bohr and other scientists of that time, Krogh
was reluctant to publish them. The results of his investigations on the
transport of oxygen and carbon dioxide in the lungs were ultimately
published in 1910 in a series of seven works (“the seven little devils”).
This discovery, important in itself was not the reason for the award of
Nobel Prize to Krogh.

After many years of work, Krogh was able to demonstrate the
mechanism of increased blood flow in the skeletal muscle during exercise.
He showed that normally, most of the capillaries in the muscle are
closed. Only a few open capillaries are able to provide nutrients to a
resting muscle. During exercise, when the metabolic demand of the muscle
increases, more capillaries open up and increase the amount of blood
flow to the muscle. These findings were published in 1919. Krogh
received the Nobel Prize in medicine for this work in 1920. Few scientists
have been lucky to get recognition so soon after they made a discovery.
Most of other Nobel Laureates in medicine had to wait for decades
before their contribution was considered worthy of an award.
Krogh did extensive research in exercise physiology. In 1910, he
developed an automatically controlled bicycle ergometer. The bicycle
ergometers developed by him are still in use in an exercise physiology
lab in Copenhagen. He also devised a balance to measure body weight
during exercise. With this instrument he could record a decrease in
body weight of as little as one gram (e.g. due to sweating).

JS HALDANE
John Scott Haldane (1860–1936) (Fig. 5.16)
was a British physiologist who made notable
contributions to respiratory physiology.
Haldane was born in Scotland and graduated
in medicine from the University of Edinburgh in
1884. Right at the beginning of his career,
Haldane got interested in the study of gases
found in the coal mines. In a series of experiments
on animals and himself, he succeeded in
elucidating the true physiological actions of
carbon monoxide, the poisonous gas found in the coal mines.

Fig. 5.16: JS Haldane

Due to the inhalation of carbon monoxide, his own mentation was often
affected. Once he sent a number of telegrams about his
welfare to his wife, one after another, thinking that he had sent only one
telegram! He also devised certain tests by which even small concentrations
of carbon monoxide could be detected. He reported that small animals
such as birds or mice showed the toxic effects of the gas much before
man is affected. Therefore, he recommended the use of these animals to
give the warning of the danger.
Haldane spent a lot of time on the London Underground (Railway),
collecting air samples in a jar that he hung out of the window. He usually
carried a hand-bag with “London Fever Hospital” inscribed boldly on it.
As a result, usually he was the only occupant of the compartment! He
proved that because of the smoke released by the coal engines of the
trains, the levels of carbon monoxide in the Underground system were
unacceptably high. As a result of his investigations, the Underground
Railway was soon electrified.
Between 1892 and 1900, Haldane developed a number of new
methods to investigate the various aspects of respiratory function. He
published a number of papers on methods for determining the respiratory
gas exchange, the amount of hemoglobin in the blood, the quantity and
tension of gases in the blood, the volume of blood, and the analysis of
air. The apparatus which he devised for the analysis of gases in air and
blood is widely used even today.
The year 1905 saw the publication of the most important of his
physiological researches. He showed that carbon dioxide plays a central
role in the neural control of respiration. He showed that the respiratory
center of the brain is highly sensitive to the tension of carbon dioxide in
the arterial blood reaching it. Since carbon dioxide is the chief
product of tissue metabolism, breathing is automatically adjusted
according to the degree of body activity. In 1905, he demonstrated the
importance of wet-bulb temperature (humidity) on the human tolerance
of high environmental temperature.
Haldane’s other notable contribution to respiratory physiology was
in the field of deep-sea diving. In 1905, the British Navy approached
him about the problems faced by deep-sea divers. At that time, the
Royal Navy offered very little formal training for the divers. Often,
pupils arriving from gunnery school were given a helmet and told to
dive, leading to disastrous consequences. Whenever a person dived to a
depth of 20 meters, he soon complained of exhaustion. Deaths were not
uncommon among the divers, and the bends (severe pain in various parts
of the body) were an everyday occurrence. Experimenting
with animals and often diving himself, he studied the problem of deep-
sea divers. He concluded that the problem was due to the fact that air
was inhaled under high pressure under water and therefore, gases like
oxygen, carbon dioxide and even nitrogen got dissolved in tissue fluids.
On ascent, the dissolved gases evolved as bubbles which tore through
the tissues, especially the nerves, hence the severe pain. The amount of
dissolved gases increased with the depth and the duration of dive. After
exhaustive studies, Haldane came out with a series of tables which
indicated the optimum duration for a dive to a given depth as well as
the duration of ascent. His work also resulted in the development of
portable Self Contained Underwater Breathing Apparatus (SCUBA)
which made deep-sea diving safe not only in warfare but also for
commercial and recreational activity. In spite of such significant
contributions to the respiratory physiology, Haldane was not awarded
a Nobel Prize, possibly because he was closely associated with the
British Navy, and some of his discoveries were directly helpful in its
war-efforts during World War I.
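Haldane's reasoning, that each tissue takes up and gives off dissolved gas exponentially with its own half-time, and that ascent is safe while the tissue tension stays below roughly twice the ambient pressure, can be sketched in a few lines. This is an illustrative modern restatement, not Haldane's original tables; the 20 m depth, the 5- and 75-minute half-times, and the pressures used are assumptions for demonstration only.

```python
import math

def tissue_tension(p0, p_ambient, half_time_min, minutes):
    """Exponential inert-gas uptake/washout in one tissue compartment.

    p0: initial nitrogen tension in the tissue (bar)
    p_ambient: inspired nitrogen partial pressure at depth (bar)
    half_time_min: tissue half-time in minutes
    """
    k = math.log(2) / half_time_min
    return p_ambient + (p0 - p_ambient) * math.exp(-k * minutes)

# A dive to 20 m: ambient pressure is about 3 bar, so the inspired
# nitrogen partial pressure is roughly 0.79 * 3 bar.
p_n2_surface = 0.79
p_n2_at_20m = 0.79 * 3.0

# After 30 minutes at depth, a "fast" 5-minute tissue is almost
# saturated, while a "slow" 75-minute tissue has taken up far less.
fast = tissue_tension(p_n2_surface, p_n2_at_20m, 5, 30)
slow = tissue_tension(p_n2_surface, p_n2_at_20m, 75, 30)

print(f"fast tissue N2 tension after 30 min: {fast:.2f} bar")
print(f"slow tissue N2 tension after 30 min: {slow:.2f} bar")
```

In this picture, the diver may ascend until the ambient pressure falls to about half of the most heavily loaded tissue's tension, wait there while gas washes out, and then repeat the step; this staged decompression is what Haldane's tables encoded.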

GERHARD DOMAGK
DISCOVERY OF SULPHONAMIDES

Gerhard Domagk (1895–1964) (Fig. 5.17) was a German pathologist and
bacteriologist. He was awarded the
Nobel Prize in Medicine, 1939 for the
discovery of a sulphonamide, Prontosil,
the first effective drug against bacterial
infections.
Domagk was the son of a school
master. When he joined a medical school,
WW I broke out. He left medical studies
and joined the army as a soldier. He was
wounded in December 1914, but he
served the army for the remaining years
of the war as a paramedic.

Fig. 5.17: Gerhard Domagk

At the end of
the war, Domagk resumed his medical
education. By the 1930s Domagk was the director of Bayer's Institute of
Pathology and Bacteriology. In 1932, Domagk made the discovery that
a red dye-stuff, named prontosil by him, protected mice and rabbits
against lethal doses of staphylococci and streptococci. The drug was
yet to be tried on human patients. Domagk’s own daughter became very
ill with a streptococcal infection. Since no other drug was available, she
was administered prontosil and cured of the infection within a few days.
Domagk’s work thus gave an effective tool to combat medical and surgical
infections. For this work, Gerhard Domagk was awarded the Nobel
Prize in medicine, 1939. However, the Nazi Government forced him
to decline the award, because a critic of the Nazis, Carl von Ossietzky, had
been awarded the Nobel Peace Prize in 1935. After the end of WW II,
Domagk was finally able to receive the Nobel Prize but not the monetary
portion of the prize since it had become time-barred.
The active portion of the prontosil molecule was soon isolated and
named sulphanilamide. Surprisingly, sulphanilamide was effective against
infections only when administered to an infected man or animal. It did
not kill the bacteria in vitro. This mystery was solved when the mode of
action of sulphanilamide was worked out in 1940. It was found that the
sulpha drugs blocked the bacterial utilization of para-aminobenzoic acid, which
the bacteria need for the synthesis of folic acid, essential for bacterial
growth. Thus, sulpha drugs were found to be bacteriostatic (preventing
growth of bacteria), not bactericidal. During WW II, sulphanilamide
powder was extensively used by American soldiers to prevent infection
in open wounds. Over the years, the use of sulpha drugs brought to light
certain side effects like allergic reactions and kidney stones. After the
discovery of penicillin and subsequently other antibiotics, the use of
sulpha drugs as anti-bacterial agents has declined. However, because of
the low cost, they are still being used in many developing countries.

ALEXANDER FLEMING
THE BEGINNING OF ANTIBIOTIC ERA

Sir Alexander Fleming (1881–1955) (Fig. 5.18) was a Scottish
bacteriologist and pharmacologist who is considered the father of the
antibiotic era in pharmacology and medicine. His most well-known
achievements are the discovery of the enzyme lysozyme and of the
antibiotic penicillin from the fungus Penicillium notatum, in 1928.
For this work, he shared the Nobel Prize in medicine, 1945 with
Florey and Chain.
Having worked in a shipping office for four years, Fleming inherited
some money from an uncle. With this fund, he enrolled in a medical
school in London and qualified with
distinction in 1906. He then became an
assistant to the bacteriologist Sir Almroth
Wright, a pioneer in vaccine therapy and
immunology. He served throughout WW I
as a captain in the Army Medical Corps.
During this period he had first-hand
experience of the havoc produced by
bacterial infection in the gun-shot wounds
which often necessitated amputation.
During the war, he found that the antiseptic
treatment of the wounds often led to severe
damage to the healthy tissues as well,
resulting in delayed wound healing.

Fig. 5.18: Sir Alexander Fleming

Therefore, on his return to civilian life, he
started active research into antimicrobial agents. In 1922, Fleming
discovered lysozyme, the body’s own bactericidal agent present in many
body fluids.
In 1928 Fleming became professor of bacteriology at St Mary's
Medical School. At that time, Fleming was well-known to be a brilliant
but careless researcher. The cultures he was working on were often
found forgotten in some corner of the usually chaotic research laboratory.
After returning from a long vacation in September 1928, he decided to
clean his research table of all the useless culture plates. He was surprised
to find that on one of the staphylococcus culture plates there was a
greenish mould around which a ring-shaped area was free of
any bacterial growth. Fleming concluded that some product of the fungus
had killed the bacteria (Fig. 5.19). Thus the first antibiotic was discovered.
On further enquiry, the fungus was found to belong to the Penicillium family.
Therefore, the product was named penicillin by Fleming.

Fig. 5.19: Petri dish with penicillin mould

He investigated the effect of penicillin on a number of other bacteria.
It was found that
the antibiotic was effective against many gram-positive bacteria, such as
those which cause scarlet fever, pneumonia, meningitis and
diphtheria, as well as the organism of gonorrhea. The results were published in the British Journal of
Experimental Pathology in 1929, but penicillin was put to clinical use
only in 1944.
Soon after publication of the report mentioned above, Fleming found
that it was difficult to grow the penicillium mould, and having done so, it was
still more difficult to refine its product. The action of the crude product
was so slow that Fleming was convinced that penicillin would not be
able to kill the bacteria in the human body. Therefore he showed no
further interest in the subject and started devoting his time to other
research projects.

About ten years later, Fleming's research paper on penicillin
came to the notice of two scientists, Howard Florey and Ernst Chain,
working at the Sir William Dunn School of Pathology in Oxford.
They succeeded in developing a method of purifying penicillin to an
effective and stable form and published the results in 1940. On reading
the article, Fleming came to their lab to see what they had done. When
Fleming introduced himself, Chain exclaimed, “I thought you were dead!”
Still there was difficulty in mass production of penicillin. From
1941 clinical trials of the drug were started on the soldiers injured in
WW II. A greater part of the penicillin administered to the patient was
excreted in the urine. In view of the scarcity of the drug, urine of the
patients was collected to extract penicillin, which was reinjected. There
was an urgent demand for the antibiotic from the army, but, because of
the uncertainties of the war, no British pharmaceutical company was
willing to invest money in the project. As a last resort, Florey visited the
USA and sought the help of an American drug manufacturer for the mass
production of penicillin. By 1944, penicillin was available on commercial
scale and was extensively used in the last months of WW II. Penicillin got
the nickname "the wonder drug". The antibiotic era thus started in
1944. Fleming, Chain and Florey were awarded the Nobel Prize in
medicine, 1945, for the discovery of penicillin and developing it into a
therapeutically useful form.

SELMAN WAKSMAN
Selman A Waksman (1888–1973) (Fig. 5.20) and his team of scientists
discovered a large number of antibiotics of which streptomycin was of
great importance since it made the treatment of tuberculosis possible.
For this work, Waksman was awarded the Nobel Prize in medicine in
1952.

Waksman was Russian by birth, but
migrated to the United States just after
passing high school. He got a Master's
degree in agriculture and a PhD in
biochemistry. He was appointed
professor of soil microbiology in 1930, a
position he held till retirement. In 1932,
he was approached by the American
Tuberculosis Association to find out how
tubercular bacilli were destroyed when
the sputum loaded with such bacteria or
many other pathogens was mixed with
the soil. He proposed that the soil bacteria
defended themselves by producing an
unknown substance that destroyed the
tubercular or other bacteria.

Fig. 5.20: Selman A Waksman

He also coined the term "antibiotic" for the
substances produced by one microorganism that suppress the growth
of another. Thus began the search for an antibiotic effective against
tubercular bacillus. In the next 10 years, Waksman and his team screened
10,000 soil samples and found a number of antibiotics but their toxicity
made them unfit for human use. In 1941, Waksman was about to be fired
from the job, since the finance department was not willing to pay 4600
dollars per year to “play around with the microbes.” Somehow, Waksman
retained his job and in 1942 he isolated streptomycin, an antibiotic that
made a complete cure for tuberculosis possible. For this discovery,
Waksman was awarded the Nobel Prize in medicine, 1952.
The award of Nobel Prize generated quite a controversy. A large
number of graduate students were involved in the research activity in
the lab of Waksman. One of them, Albert Schatz staked his claim for the
award. It was like a foot soldier claiming credit for winning a battle.
However, to end the litigation and the associated unpleasant publicity,
Waksman made an out-of-court settlement for a substantial sum of
money.

HISTORY OF TUBERCULOSIS
Mycobacterium tuberculosis has infected humans for thousands of
years. Fragments of vertebral columns of Egyptian mummies (2400
BC) have been found to show definite lesions of tubercular decay. In
460 BC Hippocrates identified “phthisis” as the most prevalent disease
of the times. He also observed that it killed everyone who was infected.
He even advised physicians not to visit patients in late stages of the
disease to avoid the danger of catching the disease. The ancient Ayurvedic
books on the Hindu art of medicine also give a description of the disease.
During the Industrial Revolution in Europe, tuberculosis was
rampant among the urban poor. Such patients, with red swollen eyes, pale
skin, coughing blood, gave rise to the legend of vampires. It was believed
that such persons would be sucking human blood to replenish the amount
lost in the sputum. So to get rid of the “vampire,” the body of a patient
who had died of tuberculosis, was often dug out from the grave; the
heart was taken out and burnt.
As early as 1679, Sylvius wrote his Opera Medica, in which he
described the tubercles as the characteristic lesions in the lungs and
other tissues of the patient. After it was established that the disease was
contagious, tubercular patients were often forced into the sanatoria,
because good nutrition and fresh air were considered an effective treatment.
However, a stay in the sanatoria was mostly useless and usually the
patient died within five years of admission.
A Danish doctor, Niels Finsen advocated sunbathing (photo-
therapy) as a treatment for tuberculosis. His disciple and a promoter of
this claim, August Rollier, opened a chain of high-altitude tuberculosis
sanatoria. Acres of near-naked bodies arrayed on cots, soaking up
ultraviolet rays in cool bright alpine air could be seen in the last decade
of the 19th century. This method of treating tuberculosis fetched Finsen
a Nobel Prize in medicine, 1903, even though no tubercular patient
could possibly have been cured of the disease.
In 1882, the tubercular bacillus was identified and described by
Robert Koch. Koch was awarded the Nobel Prize in medicine, 1905
for this work. Still no remedy was in sight. An attenuated bovine strain
of Mycobacterium tuberculosis was developed as a vaccine by two
French scientists, Albert Calmette and Camille Guérin in 1906. It
was named BCG (Bacillus of Calmette and Guérin). The BCG vaccine
was used in France from 1921 onwards but due to false national pride,
it was not used in UK, USA or Germany until after the end of WW II.
In 1815, one in four deaths in England was due to tuberculosis. One
hundred years later, in 1918, tuberculosis was still the cause of one
death in six. Public health measures had somewhat decreased
the incidence of tuberculosis but the breakthrough was the discovery of
streptomycin in 1942. There was a sharp decline in the incidence of
deaths from tuberculosis in the early 1950s. However, hopes of total
eradication of tuberculosis were belied by the rapid development of
resistance to streptomycin. The emergence of other antitubercular drugs
gave some hope but still the disease is rampant in the developing countries
of Asia, Africa and Latin America. According to a WHO estimate, even
now, each year eight million people are infected and three million people
die from tuberculosis.
When Tubercular Patients Danced with Joy. Soon after the use
of streptomycin in the treatment of tuberculosis began, it was realized
that after an initial phase of recovery, often the lesions persisted in spite
of the continued medication. The reason was that the tubercular bacilli
soon developed resistance to streptomycin. Therefore, within a few
years of the discovery of streptomycin, the search began for other
antitubercular drugs. In the late 1940s, two drugs, isoniazid and iproniazid,
were isolated which showed some antitubercular activity in animals.
The drug trial was started in a large tuberculosis hospital in advanced
cases of streptomycin-resistant tuberculosis. One group of patients,
who received isoniazid, showed dramatic recovery. The other group, who
received iproniazid did not show any favorable response. As expected,
these patients were dull, apathetic and highly depressed. But, within a
few weeks of the start of the iproniazid therapy, most of the patients
were found to be “inappropriately happy,” in spite of the absence of any
improvement in their tubercular lesions. Some of them even began to
sing and dance in the ward. These results were later picked up by a
psychopharmacologist, Nathan S Kline, and iproniazid began to be
used as an important antidepressant drug.

RUSTOM JAL VAKIL


Rustom Jal Vakil (1911-1974) (Fig. 5.21)
was an eminent cardiologist of Bombay.
He is remembered for the introduction of
the first really effective antihypertensive
drug, Rauwolfia serpentina, to the world.
Vakil was the only son of a well-
known medical practitioner of Bombay.
His father died when Vakil was only a
child. His mother, a lady of wit and
determination, was responsible for the
upbringing of the child. She even
accompanied him to London when Vakil
was admitted to a medical school. When
Vakil was of school-going age, she tried to
get him educated in an elite school.

Fig. 5.21: Rustom Jal Vakil

During the prolonged interview for
the admission, the mother and child were kept standing by the Principal,
an English woman. At the end, she was informed that her son was not
good enough for the school. To this, Mrs Vakil retorted: “I am not sorry
that my son fails to get admission in your school. He might have picked
up the bad manners of the Principal who does not even have the courtesy
to offer a seat to a lady.” The “not good enough” boy was to become an
international celebrity within a few decades.
Vakil graduated in medicine from a medical school in London in
1934. During his undergraduate medical career, Vakil won 27 medals.
He passed the MRCP examination in 1936 and obtained an MD from the same
university.
Vakil returned to Bombay in 1938 and started private medical practice.
He was a consultant to many big hospitals in Bombay, including the
King Edward Memorial Hospital and Grant Medical College Hospital.
Soon he earned a good reputation in the treatment of heart ailments, and
became the first cardiologist in India. Vakil started a meticulous collection
of clinical data, particularly on the role of an Ayurvedic herb, Rauwolfia
serpentina, in patients suffering from hypertension.
Rauwolfia serpentina is an ancient Ayurvedic medicine. Practitioners
of traditional Indian medicine used this herb for the treatment of a
variety of ailments, particularly as a sedative in cases of insanity or as
an antidote to stings of insects or snake-bites. The Indian medical
fraternity (practitioners of the allopathic system of medicine) had got
interested in the herb as early as 1931. Within a decade many papers
were published in the prestigious Indian Journal of Medical Research on
its pharmacological properties. By the time Vakil started the medical
practice in Bombay, the herb was available as a tablet under the brand
name of “Serpina.” Vakil’s first publication on the beneficial role of
serpina in hypertension was as a preliminary note in the Medical Bulletin,
Bombay in 1940. From then onwards, Vakil started using serpina in all cases
of hypertension, maintaining meticulous records of the results. By the
time Vakil brought the drug to the attention of international medical
community in 1949, serpina was the most popular antihypertensive
remedy in India. A single manufacturer of the drug claims to have sold 50
million tablets of serpina between 1940 and 1950.
In the late 1940s, blood pressure measurement had become a
routine part of clinical examination of a patient. Consequently,
hypertension was detected in a large number of patients, many of whom
had no symptoms directly related to the disorder. Apart from the use of
diuretics, the medical profession could not offer any effective
antihypertensive drug. Some sympatholytic drugs, no doubt, were available,
but because of very unpleasant side effects, they were seldom accepted
by the patients. At this juncture, in 1949, Vakil published a paper in the
British Heart Journal and thus brought R. serpentina to the attention of
the international medical community. In this paper Vakil had summarized
10 years of his experience with the drug. The data was scientifically
collected and analyzed. Hence the discovery of an effective and safe
antihypertensive drug caught the imagination of the Western world.
Within five years of the publication of the paper by Vakil, about 100
papers appeared worldwide confirming the effectiveness of R. serpentina
as an antihypertensive agent. In 1952 the active ingredient of R. serpentina
was isolated in the West and named reserpine.
Concurrently, other therapeutic uses of reserpine came to be
recognized. In 1954, Nathan Kline reported the use of R. serpentina in
the treatment of neuropsychotic disorders. For his "trail-blazing and
epoch-making discovery," Rustom Jal Vakil received numerous
awards, including the Padma Bhushan conferred by the President of India;
the Dr BC Roy Award, the Shanti Swarup Bhatnagar Award, the Dhanvantari
Award, and many international awards.
Gradually, with time, many other effective antihypertensive and
antipsychotic drugs were discovered. Now reserpine is seldom used,
but it will remain a bright chapter in the history of hypertension,
particularly as an Indian contribution to international medical science.
Dr Rustom Jal Vakil died at the age of 63, from aortic dissection and
myocardial infarction.

ROLE OF BATTLE-FIELDS
IN MEDICAL RESEARCH
Since the Middle Ages, battle-fields have been instrumental in many
discoveries and advances in medicine. Battle-fields were considered
Schools for Surgeons because thousands of casualties would provide
experience that a life-time of practice among civilians could not. The
variety of injuries seen in battle is not encountered in civilian practice. Moreover,
a surgeon was free to try different techniques of treatment, or experiment
with new ones, with no questions asked. That is why army surgeons
were the first to write detailed reports on conditions such as hospital gangrene,
tetanus, erysipelas, etc.
Ambroise Paré was a French barber-surgeon with vast experience
in battle-field surgery. He is remembered for the vascular ligature ("Paré's
ligature") devised by him in the middle of the 16th century as a substitute for
the use of boiling hot oil or a hot iron to stop bleeding from a ruptured
blood vessel.
Battle-fields also served as arenas for evaluation of a new advance in
medical research. For example, in the Crimean War (1854-55), Florence
Nightingale could demonstrate the benefits of hospital sanitation in
reducing the hospital mortality. During the Franco-Prussian War (1870-71)
the importance of antiseptic surgery was proved. The Germans, who
adopted Lister's antiseptic methods, had far better outcomes among casualties
than the French, who ignored Lister completely.
In the Crimean War, more troops were killed by typhoid than by
the enemy. An army surgeon, Almroth Wright, introduced the typhoid
vaccine. Wholesale inoculation of the British troops was attempted in
the Boer War (1899-1902) but it met with such stiff resistance that
only 4% of the troops could be inoculated. As a result of this blunder, the
army suffered about 60,000 cases of typhoid fever resulting in death of
10,000 troops. During World War I, slightly better acceptance of
typhoid vaccination reduced the number of typhoid cases to 20,000 and
1000 deaths. In World War II, compulsory typhoid vaccination of
the British troops dramatically reduced the incidence of typhoid.
During World War I, orthopedic and reconstructive surgery
made great strides. Thousands of soldiers were discharged from the
hospitals with gross functional disabilities. Dr Robert Jones was a
well-known orthopedic surgeon who organized big orthopedic hospitals
with rehabilitation centers for such soldiers.
By the time of World War I, the cause of malaria was fully established
and quinine was the only drug for its treatment. The main source of
supply of the drug was the Dutch East Indies. An ample supply of quinine
was essential for any military campaign in tropical and subtropical
countries. The Germans became apprehensive that their supply route for
quinine might be cut off. Therefore they started looking for an alternative
drug. Only in 1933 could they produce mepacrine. By the time of
World War II, the Germans had synthesized another quinine substitute,
chloroquine, now the most widely used antimalarial drug.
Penicillin was discovered by Alexander Fleming, a British scientist,
in 1928. However, it was not used therapeutically till 1941. It was
known that a large amount of penicillin is excreted in the urine. At that
time its supply was so scarce that penicillin was extracted from the
urine of patients on penicillin therapy and reinjected. The
drug was urgently required by the troops but, because of the war, no
pharmaceutical company in the UK was willing to invest money for the
commercial production of penicillin. A British delegation visited the
USA to invite them to produce penicillin. The American pharmaceutical
companies plunged into the project whole-heartedly, produced millions
of units of penicillin and saved the lives of thousands of soldiers.
Nitrogen mustards, the cytotoxic anticancer drugs, are also a product
of war. During WW I, mustard gas was used by the Germans, causing
heavy casualties among the British troops. During autopsies on the victims,
aplasia of the bone marrow, dissolution of lymphoid tissue and ulceration
of the GI tract were noticed. In WW II, the Allied forces also stockpiled war
gases, and subsequent reports of the effects of accidental exposures to these
gases led to their use as anticancer drugs.

DISCOVERY OF VITAMINS
The term vitamine was coined by a Polish biochemist, Casimir Funk,
in 1912. On the basis of the knowledge available at that time, Funk came
up with a revolutionary idea that some diseases could be caused by lack
of some trace substance in the diet. He cited three diseases as typical of
these deficiency diseases: xerophthalmia, an eye disease, beriberi, a
deadly neurological disease of the Far East, and scurvy. The unknown
hypothetical substances whose dietary deficiency caused these diseases
were named vitamines A, B, and C, respectively, for the three diseases.
(Vita in Latin means life. All these factors were thought to be amines).
When later researches revealed that some of these nutritional factors
were not amines, the term was modified to vitamin. Actually most of
the deficiency diseases and their treatments were known long before the
vitamins were discovered. A brief history of these diseases and the
discovery of their treatments would be found educative as well as
interesting.
Scurvy was recorded by Hippocrates. However, epidemics
of the disease were usually found among seamen on long voyages for the
purpose of trade or colonization. The sailors were on the high seas, often
as long as one year or more. The first outbreak of scurvy was recorded
in the crew of Vasco da Gama during his expedition to India in 1497.
During that trip, da Gama lost more than half of his sailors mostly
because of scurvy. In 1520, Magellan lost more than 80% of his crew
while crossing the Pacific Ocean. Scurvy came to public knowledge in
England in the 1740s, when a navy expedition in the Pacific Ocean lost 1400
seamen out of a total of 2000, due to scurvy. The official account of the
voyage gave the following description of the victims of scurvy: skin as
black as ink (subcutaneous hemorrhages), ulcers, severe pain in the
bones (subperiosteal hemorrhages), gums swollen over the teeth, and
rotten gums which gave an abominable odor to the victims'
breath. Later the teeth fell out of the jaws. The patients showed sensory
and psychological overreaction. The mere sound of a gun-shot was
enough to kill a man in the last stages of scurvy. A severe internal
hemorrhage was usually the terminal event.
In the reports of the above-mentioned voyages, there were brief
mentions of the benefits of green vegetables and citrus fruit in the
prevention of scurvy. However, no one took any notice. In 1747, a
Scottish navy surgeon, James Lind
(Fig. 5.22), proved the efficacy of
citrus fruit and lime in the treatment
of scurvy on a British naval ship,
Salisbury. He chose 12 patients with
severe scurvy and housed them in a
sick bay. They were given the same
basic diet. The 12 patients were
divided into six groups and each
group was given a different remedy
which was claimed to cure scurvy—
cider, elixir of vitriol, vinegar, sea
water, 2 oranges plus 1 lemon each
day, and a medicinal paste made of
nutmeg and some other ingredients.

Fig. 5.22: James Lind

The group receiving oranges and
lemon each day made a remarkable recovery in 6 days. Others showed
no improvement in the disorder. Unwittingly, Lind performed the first
controlled experiment, comparing the results in groups in which all other
dietary factors were the same except the citrus fruit. He published his
results in a book, Treatise on Scurvy. However, his findings were
completely ignored for another 50 years, partly because provision of
fresh fruit on the high seas was very expensive. It was in 1795 that the
British navy adopted lemons as a standard part of food at sea. Since no
other nation adopted this article of food in the diet of seamen, British
sailors were nicknamed Limeys. One historian of Nautical Medicine,
Louis H Roddis writes: "In the 200 years, from 1600 to 1800, nearly
1,000,000 men died of an easily preventable disease. There are in the
whole of human history few more examples of official indifference and
stupidity producing such disastrous consequences to humans.”
As scurvy became rare at sea in the 19th century, epidemics of
"land scurvy" were noticed. The starving population of the Great Potato Famine,
the armies of the Crimean War and the American Civil War, the Arctic
explorers and the prospectors of the California Gold Rush were prominent
victims of scurvy because the knowledge of the prevention of sea scurvy
was not translated into the prevention of land scurvy. Ascorbic acid was identified as the active
antiscorbutic factor in the citrus fruit by a Hungarian scientist, Szent-
Györgyi, in 1928, for which he received the Nobel Prize in medicine
in 1937.
Beriberi has been endemic for centuries in South East Asia where
polished rice is the staple diet. The term is derived from a Sinhalese
word meaning “extreme weakness.” In the form known as dry beriberi,
there is degeneration of nerves followed by degeneration of muscles. In
wet beriberi, a more acute form, edema and congestive heart failure are
the prominent features. The first insight into the cause of beriberi came
in the 1880s when Dr K Takaki, the Director General of the Japanese
Naval Medical Services, noticed a correlation between the diet of the
sailors, mainly rice, and beriberi. He ordered an increase of vegetables,
fish and meat in the diet of the sailors. Within 6 years, the incidence of
beriberi in the Japanese navy dropped from 40% to 0%. These findings helped
the Japanese military, but no other
country in South East Asia took notice
and epidemics of beriberi continued. For
example, in 1886, a Dutch medical officer,
Dr Christian Eijkman (Fig. 5.23), held
the view that beriberi was an infectious
disease and spent many months trying
to find the microbe responsible for it.
During these investigations, Eijkman
noticed symptoms resembling human
beriberi in a group of chickens. After
consulting the hospital records and con-
ducting experiments, Eijkman concluded
that the chickens fed on (white) polished rice developed the ailment,
but not those fed (red) partially polished rice, and that the rice hulls
even cured the disease.
Fig. 5.23: Christian Eijkman
In 1911, a chemist, Dr Casimir Funk, crystallized an amine
substance from rice bran and called it a "vitamine" (vital amine).
However, the real antiberiberi factor, thiamine (vitamin B1), was
discovered by Dutch chemists, Jansen and Donath, in 1926, while working
in Eijkman's laboratory. Eijkman was awarded the Nobel Prize in medicine
in 1929 for his work on dietary-deficiency diseases.
Pellagra is characterized by skin rashes (especially on skin exposed
to sunlight), mouth lesions, diarrhea, mental degeneration and finally
death (the four Ds—dermatitis, diarrhea, dementia, death). The disease
was first noticed by scientists in Europe around 1720, when maize (or
Indian corn) was heavily imported from the Americas and planted in
many countries. Pellagra was highly prevalent among the poor population
of the United States, whose staple diet was maize-based. Probably the
peak incidence was seen in 1928 when thousands died in the epidemic of
pellagra in the USA. Dr Joseph Goldberger, an American physician,
made a detailed study of the disorder from 1914 onwards. He chose
volunteers in a prison and divided them into two groups. One group
received the typical poor man's diet consisting of cornmeal, molasses and
pork meat, while the other group received meat, fresh vegetables and
milk as well. Within a few months, the first group developed typical
pellagra. They were then also given fresh vegetables and milk. The
symptoms quickly disappeared. Goldberger named the unknown factor
present in the milk and fresh vegetables as "P-P factor" (pellagra-
preventing factor). Conrad Arnold Elvehjem, an American biochemist,
showed that the elusive P-P factor was niacin (vitamin B3).
In the 1920s and 1930s, many nutrients were discovered which were
believed to be essential for normal health and not formed in the human
body. All these newly discovered vitamins were named with consecutive
letters of the alphabet, namely, vitamins A, B, C, D, E, F, G, H, I, J, and
K. Subsequently some of them were found to be not really essential and
therefore dropped from the list of vitamins. Some others were
redesignated as components of vitamin B complex, e.g. vitamin B2, B3,
B6, B11 and B12. The term vitamin F was initially applied to a group of
unsaturated fatty acids (linoleic acid, linolenic acid and arachidonic acid)
which were essential for normal health and could not be synthesized in
the body. Thus, they fulfilled the criteria for being called a vitamin, but
it was considered better to classify them as essential fatty acids rather
than as a vitamin.

DISCOVERY OF ENDOCRINES
Most of the endocrine glands were identified in the 16th century but
their functions were established only in the first-half of the 20th century.
For example, the structure later named the thyroid gland was described
by Andreas Vesalius in the 16th century as a part of the larynx. The
anatomy of the gland was fully described by Thomas Wharton in
1656. He gave it the name thyroid since it resembles the ancient Greek
shield. However, the functions assigned by him to the structure are
rather amusing. Wharton believed that the viscous fluid within the follicles
lubricated the trachea. He also believed that it had a cosmetic function—
to give grace to the contours of the neck, since it was found to be larger
in the female. Later on, when its high vascularity became known, it was
believed to act as a vascular shunt for the brain. “It was bigger in females
to protect them against more numerous mental irritation and vexation of
mind to which they are more often exposed than males.”
The disorder Addison’s disease is named after Dr Thomas Addison,
a 19th century physician in London. In 1855, he published a book “On
the Constitutional and Local Effects of the Disease of the Supra-renal
Capsule.” This book laid the foundation of modern clinical
endocrinology. The description of the disorder is in such detail that not
much has been added to the clinical picture since then. He also described
the results of the postmortem examination in which he always found
pathology in the “supra-renal capsules,” (as the gland was named at that
time).
Thirty years after Addison's monograph, a French physician, Pierre
Marie, described and named another endocrine disease, acromegaly, in
1886, in a paper entitled “Two cases of Acromegaly: An Unusual
Hypertrophy of the Head and Upper and Lower Extremities.” On
autopsy, Marie detected an enlargement of the pituitary gland, but he
failed to correlate the two because at that time most of the physiologists
held the view that “the pituitary gland had little or no role in body
function of the higher vertebrates.” However, in 1891, Marie reported
that enlarged pituitary glands were always found in autopsies on the
cases of acromegaly.
With the discovery of antiseptic
techniques, surgery became relatively safe
in the 1870s. In Switzerland, thyroidectomy
was frequently performed as a cosmetic
treatment of large goiters. Emil T Kocher
(Fig. 5.24) performed the first
thyroidectomy in 1876. By 1917, when
he died, about 7000 thyroidectomies had
been performed by him. Emil T Kocher
was awarded the Nobel Prize in
medicine, 1909, for his work on the
physiology, pathology and surgery of the
thyroid gland. Unfortunately, many of
these patients developed myxedema, but it was variously attributed to
injury of the nerves or trachea or "to the loss of some blood-making
function of the gland."
Fig. 5.24: Emil T Kocher
In 1891, George
Murray described his successful attempt to treat human myxedema
with extracts of sheep thyroid. At that time, the discovery of the use of
thyroid extract in the treatment of myxedema was overshadowed by the
claims of a French physician, Brown-Séquard, in 1889 that at the age of
72, he had rejuvenated himself by injecting testicular extracts of dogs
and guinea pigs. Despite complete rejection of the claim by the medical
fraternity, by 1890, about 12,000 physicians were prescribing testicular
extract to their patients. The report of Brown-Séquard on the benefits
of testicular extracts was an ignominious end to his brilliant track
record in neurophysiologic research.
The term hormone was used for the first time by a British physiologist,
Ernest Starling, in 1902, in reference to the newly discovered chemical
substance, secretin, regulating the secretion of exocrine pancreas. Ernest
Starling and William Bayliss, a physiologist, were investigating the
regulation of the pancreatic secretion. Till that time only the neural
control of gastric secretion had been discovered by Pavlov. They found
that the secretion of pancreatic juice was markedly increased when
acidic chyme entered the duodenum. Even after complete denervation of
the two organs, the presence of acidic chyme increased the pancreatic
secretion. The two scientists concluded that “it must be a chemical
reflex.” To test the hypothesis, they prepared an acid extract of the
duodenal mucosa and injected it intravenously into an anesthetized dog.
Within a few minutes, the pancreatic secretion increased. Thus, the
hormonal control of gastrointestinal secretions was discovered for the
first time, and the chemical was named secretin.
The discovery of the functions of adrenal cortical hormones was a
long drawn-out process. By 1929, Dr
Edward C Kendall (Fig. 5.25), an
American physician and scientist, had
isolated thyroxin (and glutathione).
Then began his search for the hormones
of adrenal gland which lasted two
decades. By 1938, he had collected
adrenal glands of thousands of beef cattle
and isolated six steroids and named
them compounds A, B, C, D, E, and F,
using the sequence in which they were
discovered. But he could not discover the
physiological role of any of them.
Consequently, the research project was proposed to be closed in 1941.
Fig. 5.25: Edward C Kendall
However, the Allies of WW II received
“secret information” that the German pilots were able to fly their
airplanes at an altitude of 40,000 ft. because they received injections of
adrenal gland extract before the flight. The report was, of course, false
but the American government released funds for Dr Kendall to conduct
further research on adrenal corticosteroids. By the late 1940s, the compound
E could be synthesized. Though cattle were no longer required to obtain
corticosteroids, no one knew what to do with the drug. It could be tried
in Addison’s disease, but the disorder is so rare that any meaningful drug
trial was impossible. Before any physiological role could be assigned to
them, adrenal corticosteroids were found to be useful in the treatment of
rheumatoid arthritis.
Rheumatoid arthritis is one of the most crippling diseases. In the
United States, its incidence is as high
as 4%. There was no treatment for the
most painful deformities caused by it.
Dr Philip S Hench (Fig. 5.26), head of
the unit of arthritic diseases in the Mayo
Clinic, had noted that the women with
rheumatoid arthritis were relieved of the
symptoms during each pregnancy. In
1948, Kendall and Hench decided to
try the compound E (later called
cortisone) on a woman suffering from
rheumatoid arthritis. The patient
showed dramatic recovery within a few days.
Fig. 5.26: Philip S Hench
Subsequently, compound E was tried on many similar patients, all
of whom recovered. Kendall and Hench were awarded the Nobel Prize
in medicine, 1950, for their work on adrenal corticosteroids.

FREDERICK BANTING
DISCOVERY OF INSULIN

Frederick Banting (1891-1941) (Fig. 5.27) was a Canadian physiologist
best known for the discovery of insulin. In 1923, he was awarded the
Nobel Prize in medicine for this work.
Banting was born in Ontario, Canada.
He graduated in medicine in 1916 and
immediately joined the Canadian Army
Corps. He won the Military Cross for his
work in World War I. He started medical
practice in 1920. Soon he became
fascinated with the idea of research on
insulin. By that time, it had been shown
that removal of the pancreas of a dog resulted
in development of diabetes mellitus.
There was a strong possibility that some
product of the islets of Langerhans was responsible for maintaining
normal blood glucose level.
Fig. 5.27: Frederick Banting
However, all efforts to extract the material from the islets
were foiled by the action of exocrine pancreatic enzymes. Banting
theorized that if the exocrine pancreatic acini were destroyed by tying
the pancreatic duct, it might not affect the islets. Thus it might be possible
to obtain an extract of the islets of Langerhans. He presented this idea to
John Macleod, a professor of physiology in the University of Toronto,
Canada. Macleod at first scoffed at the idea. Banting pestered him until
finally he was given a laboratory space, 10 experimental dogs and a
young medical student, Charles Best, as an assistant.
In May 1921, Banting and Best started the animal experiments. By
tying the pancreatic duct, they were able to create atrophy of the
pancreatic acini, without damaging the islets of Langerhans. The solution
extracted from these islets was injected into the dogs in which diabetes
had been produced by complete removal of the pancreas. The diabetic
dogs quickly recovered from the experimentally produced diabetes.
Another scientist, James Collip, a member of the team of Macleod,
helped in the purification and isolation of insulin. The results were
published in February, 1922. The discovery was considered so important
that the very next year (1923), Banting and Macleod were jointly
awarded the Nobel Prize in medicine. A study of the history of the Nobel
Prizes in medicine reveals that the award was mostly given many
years after the discovery, only when its usefulness to humanity was
fully established. Banting and Macleod are probably among the few
scientists whose work received the award within a year of publication
of their research work.
Charles Best was a coauthor of the award-winning research
publication. But his claim to the award was ignored because at that time
he was merely a medical student. Banting publicly expressed his anger
at the decision and split his share of the award money with Best, whereas
Macleod split his share of the prize with Collip. (Later, Charles Best
coauthored the famous physiology textbook, The Physiological Basis of
Medical Practice.) In a most selfless move, the
four scientists decided not to seek a patent for the life-saving discovery.
As a result, within a few months, insulin was available in the market at
a very low price.
In the late 1930s, as war with Germany looked imminent, Banting
renewed his interest in military service. He played a major
role in the creation of G-suits, which were used by the Royal Air Force
during World War II. In February 1941, Banting died in a crash of an air
force plane. After the crash, Banting is said to have bandaged the pilot’s
wounds, before he succumbed to his own injuries.

WALTER CANNON
Walter Bradford Cannon (1871-1945) (Fig. 5.28) was a renowned
American physiologist. He is remembered for his research on
gastrointestinal motility, the concept of homeostasis, and above all the
term "fight or flight," which he coined to describe an animal's response
to a life-threatening situation.
As a medical student at Harvard
Medical School, Cannon found the
medical texts boring and sleep-inducing
(as they are even nowadays). He observed
with some envy the enthusiasm of his
roommate, a student of Harvard Law
School. In the law school, the teaching
was based on case studies rather than
dry didactic lectures. Realizing the
similarities between the medical and
legal case histories, as a medical school
senior, Cannon wrote an article
suggesting that hospital records should be used to teach medicine.
Fig. 5.28: Walter Bradford Cannon
In the years
that followed, his suggestion was adopted by all the medical schools in
the USA.
His first area of research was the study of gastrointestinal motility
using the recently discovered X-ray technology. He was the first to use
the barium “meal” as a tool for the study of gastrointestinal motility.
The work on swallowing and gastric motility was done by him while
still a medical student. The result of this research was published in the
first issue of the American Journal of Physiology, in 1898. Further
studies in this field resulted in his first monograph, “Mechanical Factors
of Digestion” (1911). In this book, he described the effects of emotions
on the digestive processes. In the next decade, Cannon was busy
investigating the role of the sympathetic nervous system and the adrenal
medulla. The observation of increased secretion of the adrenal medulla
led him to develop the important concept of the "fight or flight" reaction.
Cannon’s most enduring work was the concept of homeostasis,
expounded in a book, “The Wisdom of the Body.” This book reads like a
book of wonders as it describes the extraordinary complex mechanisms
operating in the body for the preservation of the internal environment.
Walter Cannon was a firm believer in animal experimentation. In
those days, a strong lobby of antivivisectionists was as active in the
USA as in the UK. Walter Cannon was the key spokesman for the
medical establishment against the antivivisection movement. He was
the head of the Council for the Defense of Medical Research from 1908
to 1936. He helped to mobilize the medical profession, lobbied politicians,
testified in public hearings and wrote tirelessly in defense of animal
experimentation.
Walter Cannon suffered ill health for many decades. This and the
cancerous condition he developed later in life were probably the result
of his experiments with X-rays at a time when the dangers of X-ray
exposure were not known.

WALTER RUDOLF HESS


Walter Rudolf Hess (1881-1973) (Fig.
5.29) was a Swiss physiologist who
received (with Antonio Egas Moniz) the
Nobel Prize in medicine, 1949, for his
pioneering work on the control of visceral
functions by the hypothalamus.
Early in his medical career, Hess
developed a great liking for physiology.
But financial reasons led him to become
first an assistant in surgery and later in
ophthalmology, and finally, he became a
practicing ophthalmologist. This detour
was to prove an advantage because Hess learnt to operate with precision,
an essential requirement for a research worker in experimental
neurophysiology.
Fig. 5.29: Walter Rudolf Hess
In 1912, leaving a prosperous practice, Hess became
an assistant in physiology. By 1917, he had risen to become professor
and director of the Physiology Institute in the University of Zurich. He
remained in this post till 1951.
Using fine microelectrodes to stimulate or destroy specific areas in
the brains of cats and dogs, Hess found that the seat of control of
autonomic functions lies in the hypothalamus and medulla oblongata. He
mapped
control centers for each function to such a degree that he could induce
the physical behavior of a cat confronted by a dog, simply by stimulating
the proper points on the animal’s hypothalamus.
During the experimental investigations of the diencephalon, besides
the changes in visceral activity, somatomotor effects were observed
relatively often. Reports of these psychosomatic phenomena initiated
further research into psychosomatic disorders and the mode of
action of the newly discovered psychotropic drugs.

EGAS MONIZ
Antonio Egas Moniz (1874-1955) (Fig.
5.30) was a Portuguese neurologist who
was the founder of modern psycho-
surgery. He was awarded the Nobel Prize
in medicine, 1949, for the development
of prefrontal leukotomy (lobotomy) as a
radical treatment of certain psychotic
disorders.
Moniz studied medicine and neurology
in France and returned to Portugal in 1902,
but became more interested in politics than
medicine. He remained a member of
the Portuguese Parliament from 1903 to 1917,
Fig. 5.30: Antonio Egas Moniz
and in this period several times served as
a Minister. He left politics and returned to the University of Lisbon in
1921, where he remained a Professor of neurology till 1944.
In 1927, Moniz developed the technique of contrast X-ray cerebral
arteriography (a method to visualize the cerebral vessels by injecting
into the carotid artery a substance opaque to X-rays). This procedure
helped to diagnose cerebral disorders such as brain tumors or
arteriovenous malformations.
In 1936, Egas Moniz developed for the first time a surgical technique
to interrupt the neural connections between the thalamus and the
prefrontal cortex. The prefrontal cortex is the part of the brain associated
with higher intellectual functions and the emotions. The operation was
done to treat patients with mental disorders in whom all other methods of
treatment had failed. The use of prefrontal leukotomy (lobotomy) became
widespread in the 1940s and 1950s and then declined because of the
development of antipsychotic drugs as well as the fact that the
operation often did the patient more harm than good. The operation
left most of the patients intellectually deficient. The patient lost the
understanding of the norms of social behavior. For example, a senior
officer who had undergone such an operation was once found to get up
in a meeting and urinate into the waste paper basket lying nearby. Within
two decades, about 50,000 patients were subjected to this operation
worldwide. In view of the devastating effects of the operation, now
there is an effort by families of the lobotomized patients to persuade the
Nobel Prize committee to rescind the award given to Moniz. However,
the prefrontal leukotomy is still being performed in some psychotic
patients who are dangerous to society and in whom other forms of
treatment are ineffective.
WALTER FREEMAN
Walter Jackson Freeman (1895–1972)
(Fig. 5.31) was an American neurologist and
a psychiatrist who is infamous for his lifelong
advocacy of ice-pick surgery of the brain for
the treatment of psychiatric patients.
Prefrontal leukotomy was initially
developed by a Portuguese neurosurgeon,
Egas Moniz in 1936. His reports in a medical
journal attracted the attention of Freeman
who was working as a psychiatrist in a
mental asylum in the USA. Since Freeman had no surgical training, he
requested a surgeon, Dr James Watts, to assist him in performing
the operation.
Fig. 5.31: Walter Jackson Freeman
The team operated on the first case in 1946. There was initially
some opposition to the operation among the psychiatrists, which
disappeared with the award of the Nobel Prize to Egas Moniz in 1949. The
method of Moniz involved drilling holes in the skull and injecting alcohol
into certain tracts going to the frontal lobe. Not being a surgeon, Freeman
had no patience for watching James Watts perform the lengthy procedure.
Therefore, Freeman devised a new method to destroy the frontal lobes
and called the procedure prefrontal lobotomy. Initially he used an ice-
pick and a hammer to perform the surgery; hence the procedure came to
be known as ice-pick surgery (Fig. 5.32).
The operation was often performed in Freeman's outpatient clinic, in the
presence of many visitors. The patient was rendered unconscious
by electric shocks. Freeman would then take a sharp ice-pick like
instrument, insert it above the patient’s eyeball through the orbit of the
Fig. 5.32: Ice-pick surgery

eye, into the frontal lobe of the brain, moving the instrument back and
forth. Then he would do the same thing on the other side of the face. The
whole procedure took not more than 10 minutes. Initially, Watts helped
Freeman in the surgery, but soon parted company because he was
disgusted by the crude procedure being performed in a most unscientific
manner. Undeterred, Freeman continued with the ice-pick lobotomies
for more than 20 years. In this period, Freeman operated on as many as
3400 cases alone.
By 1950, Freeman’s lobotomy was in full swing. Newspapers
described the operation as easier than treating a toothache. Freeman
was on the front page of all the leading newspapers and magazines of the
USA. The operation of lobotomy spread all over the world like a wildfire.
In all, about 40,000 to 50,000 men, women and children were loboto-
mized. Women outnumbered men by two to one.
Freeman, as a showman, liked to shock his audience of doctors,
nurses and even laymen, by performing two-handed lobotomies:
hammering the ice-pick into the two eyes at once. People often fainted
when watching ice-pick surgery. Dr Edwin Zabriskie, a 74-year-old war
veteran and a professor of clinical neurology, was observed to crumple
to the ground at the sight of Freeman in action.
What was the reason behind the popularity of ice-pick surgery? In
the late 1940s, after the end of World War II, the mental hospitals were
overflowing with thousands of traumatized soldiers and their families.
With no medical treatment of psychosis available, they became a
permanent responsibility of the mental asylums. After the lobotomy,
once violent patients became so docile that they could be sent home to
live with their families. Freeman’s operation helped the mental hospitals
to get rid of the otherwise incurable patients. Freeman used to boast that
his operation costs only 200 dollars as compared to many thousand
dollars required per year to maintain a patient in the mental hospital. As
a result, directors of mental hospitals all over the USA requested Freeman
to perform the lobotomy on their patients.
Freeman advocated lobotomy not only to treat psychiatric patients
but also to “treat” social misfits like obstinate and difficult children or
“politically undesirable” people such as communists. Trouble makers
among the rich and famous were often subjected to the operation.
Rosemary Kennedy, a wayward sister of JF Kennedy, was given a
lobotomy on the orders of her father, Joseph Kennedy. After the operation,
the result was so devastating that she had to be confined to a mental
asylum till death. Another well-known victim of lobotomy was Frances
Farmer, a famously beautiful actress. She was brought to Freeman,
because her parents believed her to be too unruly. Actually she was a
communist sympathizer and a radical political activist. After the
operation she was found working as a shop assistant. “Prefrontal
lobotomy cures lots of troublesome ailments including nymphomania,
socialism and the insatiable thirst for freedom,” declared Freeman.
Prefrontal lobotomy did help the patient to get rid of suicidal or
aggressive behavior but it left the patient in a state of mind much worse
than death. The patient showed no emotions in the face of the most tragic
events. There was an absence of any sense of social or moral behavior.
Often, they needed to be retrained how to eat or use a toilet. In
short, the patients were no better than zombies.
Freeman’s popularity crumbled with the development of
psychotropic drugs. The final end came in 1967 when a patient developed
intracranial hemorrhage and died within a few hours of ice-pick surgery.
Since he had performed surgery without any surgical training,
Freeman's license to practice medicine was revoked. (Why he was not
questioned on this issue during the previous two decades remains a
mystery). Even then, Freeman was not discouraged. He sold his house
and moved from town to town, meeting old patients, trying to prove
that the ice-pick surgery had changed the lives of those unfortunate patients
“for the better.”

TREATMENT OF PSYCHOLOGICAL DISORDERS


Mental illness has always been a part of human experience. As in other
medical fields, the preferred treatments for mental illness have changed
with the changing theories of its causes. A bewildering variety of therapies
for mental illness—from exorcism to asylum, from bloodletting to
lobotomy—have been tried over the years without any reasonable degree
of success. During the last 50 years, the development of psychotropic
drugs has revolutionized the life of millions of patients with depression
or schizophrenia. The stigma of having a mental illness has lessened
considerably.
For centuries the so-called civilized societies looked upon
psychologically disturbed patients as subhumans, to be tortured or
locked away and forgotten. For hundreds of years, psychological
disorders and insanity were attributed to the supernatural, to possession
by demons or the devil. One treatment that dates back to the Stone Age
involved cutting a hole in the patient’s skull to let out the evil spirits. In
17th and 18th century Europe, thousands of such patients were tortured
to “bring them to senses.” If that failed, they were burned or hanged. As
late as in 1940s, a psychiatrist in London advocated euthanasia for the
80,000 or more “idiots and imbeciles” of the country, since according to
him, “they are incapable of being employed and their care and support
absorbs a large amount of money and energy of the normal population.”
The ancient Greeks believed that “hysteria” was caused by incorrect
positioning of the uterus. Some eminent physicians such as Hippocrates
and Galen advocated fumigation of the vagina to draw the uterus back
into place. Hippocrates also advocated purgatives to remove the
"black bile," the supposed cause of melancholia (mental depression), from the
body.
In the mid-19th century, the treatment of a mental illness consisted
of bloodletting, purging, cold water immersion and various medications
containing opium or its derivatives. In more serious cases, patients
who could not be looked after by their families were confined side by
side with criminals, unwed mothers and the destitute.
Up to the 18th century, it was common to see insane patients chained
and kept in mental asylums (Fig. 5.33). In England, chained insane men
and women, often naked, were displayed to the public for a fee. On
holidays, the people of London entertained themselves by visiting a zoo
and the nearby mental asylum. Some of the more “modern” treatments
used in first half of the 20th century such as electroconvulsive therapy
and lobotomies were no less brutal.
Fig. 5.33: Chained Insane 1500 AD

Fever treatment of mental disorders, particularly of the general
paresis of the insane, was started by Julius Wagner von Jauregg, an
Austrian physician. He observed that insane patients improved
considerably after surviving an attack of typhoid fever, erysipelas or
tuberculosis. High fever was the common feature of all these infections.
Therefore, Jauregg experimented with different methods to induce high fever in
patients suffering from the general paresis of the insane, caused by
neurosyphilis. The mental wards were full of patients with the incurable
disorder. In the first decade of the 20th century, the cause and the treatment
of malaria as well as the blood groups had been discovered. Therefore in
1917, Jauregg injected blood of a malarial patient into nine patients of
general paresis. After a few days of bouts of high fever, the malarial
infection was cured by administration of quinine. Of the nine, four
patients showed complete recovery and another two showed some
improvement. Subsequently, the fever therapy was tried in a series of
275 syphilitic patients. This was the first ever successful treatment of
a mental disorder. Jauregg received the Nobel Prize in medicine,
1927, for this discovery. By the mid-1940s, penicillin had been discovered
which was found to cure syphilis, and fever therapy was no longer
used.
Insulin shock therapy for the treatment of mental disorders was
accidentally discovered in 1927 by Manfred Sakel, a young Polish
neuropsychiatrist. By that time, insulin had been discovered and was
routinely used in the treatment of diabetes mellitus. He was daily administering
insulin to a patient who was afflicted with both diabetes and psychosis.
On one occasion, by mistake, such a large dose of insulin was given that
the patient developed hypoglycemic convulsions. On recovery from
the convulsions, the patient showed considerable improvement in the
psychotic symptoms. By 1930, he started regular trials of insulin-
shock therapy in psychotics, particularly those suffering from
schizophrenia. According to his findings, more than 70% of the cases
improved after insulin-shock therapy. The results were published in
1933. By 1934, insulin-shock therapy was in use all over Europe. Within a
few years, it was realized that insulin-shock therapy was not as effective
as made out to be, and it was totally abandoned by the middle of the 1940s.
Metrazol-shock therapy was started by Ladislaus Meduna, a
Hungarian psychiatrist, in 1933, the same year as Sakel announced his
insulin-shock therapy. Meduna had not been aware of Sakel’s work. In
his practice he noticed that schizophrenia and epilepsy were seldom
found in the same patient. He interpreted this observation as a sort of
“biological antagonism” between the two diseases of the brain. For him,
the next logical step was to try “artificial epileptic fits” as a treatment of
schizophrenia. He tried many substances to produce completely
controllable and reproducible convulsions. Ultimately, Meduna found
that intravenous injection of Metrazol quickly produced severe
convulsions in a dose-dependent manner. After a series of 110 cases,
Meduna reported improvement in 50% of the cases of schizophrenia.
By 1937, two camps of psychiatrists were firmly established—those
who defended insulin-shock therapy and those who favored treatment
by Metrazol convulsions. The convulsions produced by Metrazol were
often so violent that the patients suffered fractures of the limb
bones or even the spine. Eventually, the psychiatrists realized that the
theory of “biological antagonism” between convulsions and schizophrenia
was unfounded. By the middle of 1940s, Metrazol therapy was
abandoned in favor of an even crueler therapy: prefrontal lobotomy.
Electroconvulsive therapy (ECT) was devised by Ugo Cerletti,
an Italian neurologist, in 1937. Cerletti was a believer in Metrazol
convulsive therapy but found it dangerous: once the drug was injected,
the convulsions were uncontrollable. The therapy was also highly feared
by the patients. Cerletti had been researching epilepsy and had
frequently used electric shocks to induce epileptic convulsions in
experimental animals. In order to produce convulsions in humans, Cerletti
first experimented on animals. When he found the ideal electrical
parameters for safely producing well-controlled convulsions, he tried the
procedure on a series of patients with schizophrenia. Most of the
patients showed startling improvement. In
1939, Cerletti began a world tour to advertise the electroconvulsive
therapy. The ECT was found still more useful in the treatment of severe
mental depression. With the use of muscle relaxants, ECT became still
safer. In the 1950s and 1960s, ECT was routinely used in mental hospitals
all over the world, and remained so till psychotropic drugs became
available in the 1970s.
Prefrontal lobotomy was the cruelest of all the modes of treatment
devised in the 20th century for the treatment of psychiatric disorders. It
left the patient totally emotionless and incapable of looking after himself.
With total lack of social and moral behavior, the family was as much in
agony as before the operation, though for different reasons. Thousands
of mentally ill patients were subjected to this cruel therapy from the
1940s to the 1960s. Prefrontal lobotomy was like the darkest hour before
the dawn of psychotropic drugs.

Psychotropic drug treatment of the mentally disturbed patients
was probably the biggest triumph of the research in pharmacology after
the discovery of antibiotics. The psychotropic drugs brought a revolution
in the treatment of mental disorders. Nowadays, thousands of mentally
sick people are on treatment without others being aware of it. Being
psychologically disturbed is no longer a social stigma.
The chemical revolution in the treatment of psychosis, especially
schizophrenia, began with the release of chlorpromazine in 1954.
Originally marketed as a tranquilizer, chlorpromazine was soon found
to be more effective in schizophrenia, especially in subduing the
hallucinations and delusions. Within 8 months of its appearance in the
market, the drug had been administered to more than 3 million patients.
With the use of chlorpromazine, most patients with schizophrenia
could leave the mental asylums to lead a normal life in the community. In
the USA, it came to be known as “the drug that emptied the state mental
hospitals.”
Chlordiazepoxide (Librium) was the first antianxiety drug to be
marketed in the 1960s. Anxiety is such a common, often debilitating,
problem that, over the decades, Librium remained one of the top-selling
psychotropic drugs.
Depression is another common psychological ailment. In a given 6-
month period, about 3% of adult Americans experience severe
depression. Because of its widespread incidence, depression is jokingly
called the “common cold” of the psychiatric profession. From the 1930s
to the 1950s, ECT was the most effective treatment for severe mental
depression. In the mid-1950s, iproniazid, a drug being tried for its
antitubercular activity, was found to be more helpful in the treatment of
depression (See: When tubercular patients danced with joy, page 404).
It was prescribed to 400,000 depressed patients in the very first year of
its release in the market. It was later withdrawn because of its side-
effects. Subsequently, tricyclic antidepressants and some other
antidepressants with very little side-effects have been marketed.

OTTO LOEWI
Otto Loewi (1873-1961) (Fig. 5.34) was a
German-American pharmacologist. His
discovery of acetylcholine as a
neurotransmitter was a great step in the
progress in neurophysiology. He was
awarded a Nobel Prize in medicine in
1936 in association with Sir Henry Dale.
Loewi graduated in medicine in 1896
and started practice as a physician. After
seeing a number of deaths due to incurable
diseases such as tuberculosis and
pneumonia, he felt the futility of medical
practice in the absence of any effective
therapeutic tool. He decided to give up
clinical work and devote his time to
pharmacology research.

Fig. 5.34: Otto Loewi

Beginning in 1898,
Loewi worked for a number of years in Austria. In 1921 Loewi published
the now famous experiment on the frog’s heart. At that time, it was
unclear whether the signaling across a synapse was bioelectrical or
chemical. Loewi’s experiment answered this question. According to him,
the idea for this key experiment came to him during sleep. He woke up
at 3 am and set up the experiment. He dissected out of frogs two beating
hearts, one with the Vagus nerve attached and the other without. Both
the hearts were bathed in Ringer’s solution, flowing from the innervated
heart to the denervated heart. When the Vagus nerve was electrically
stimulated, the attached heart slowed down as expected. To his
amazement, even the second heart also slowed down. Loewi concluded
that some soluble chemical substance released on stimulation of the
Vagus controls the heart rate. He called the unknown chemical substance
“Vagusstoff,” which was later identified as acetylcholine (Fig. 5.35).

Fig. 5.35: Loewi experiment

Loewi also demonstrated two mechanisms of therapeutic importance:
the blockade and the augmentation of nerve action by certain drugs.
From 1940 onwards, Loewi lived in the USA as research professor at
the New York University College of Medicine.

HENRY DALE
Sir Henry Hallett Dale (1875-1968) (Fig. 5.36) was an English
neuroscientist who was awarded Nobel Prize in medicine, 1936, in
association with Otto Loewi.
In the 1940s, Dale was embroiled in the scientific debate over the nature
of signaling at the synapse. Dale supported the belief that the signaling
at the synapse was chemical in nature, whereas John Eccles, a renowned
Australian neuroscientist, supported the
electric theory of transmission. After a bitter
and long drawn out struggle, Dale’s theory
prevailed.
Although Dale isolated the compound
he named “acetylcholine” from ergot in
1914, its possible occurrence in the body
was thought of only when Otto Loewi
reported the action of “Vagusstoff.” Dale
was able to demonstrate the release of
acetylcholine at the motor nerve endings in
the somatic nervous system as well as in parasympathetic nerve endings.

Fig. 5.36: Henry Hallett Dale

(Sympathetic nerve endings had earlier been shown to release
norepinephrine). Dale also discovered the presence of acetylcholine in
preganglionic neurons of both the sympathetic and parasympathetic
divisions. Dale also named the “muscarinic” and “nicotinic” actions of
acetylcholine. At that time only two neurotransmitters were known—
norepinephrine and acetylcholine. A neuron was found to release only
one of the two transmitters. Based on this observation, Dale classified
neurons as cholinergic or noradrenergic. He also laid down what came to
be known as Dale’s principle—a neuron and all its terminals can
release only one type of neurotransmitter. Dale’s principle was accepted
as true for many decades till many other neurotransmitters were
discovered. Then it came to be known that some neurons released more
than one neurotransmitter at the axon terminals.

HISTORY OF
ELECTRODIAGNOSTIC TECHNIQUES
The credit for taking the first step in electrophysiology is given to Luigi
Galvani, an Italian anatomy professor of the 18th century. In 1771, in
order to prepare a soup of frog’s muscle, Galvani’s wife had hung a recently
killed frog by a copper hook over an iron railing in their garden. Galvani
was astonished to observe that whenever the hook touched the iron
railing, the frog’s muscles twitched. He incorrectly assumed that the legs
had what he called “animal electricity” (Fig. 5.37), which was released
on contact with iron. Static electricity was already known by that time.
Following the observation on the frog mentioned above, Galvani
performed a number of experiments to elicit contraction of the frog’s
muscle by stimulating the nerve attached to it. Galvani’s conclusion
was that the brain generates electricity that is distributed through the
inner core of nerves down to the muscles, making them contract.

Fig. 5.37: Animal electricity

The
report of Galvani’s experiment aroused great interest not only among
the contemporary scientists but even laymen. The demonstrations
became a high point of interest in social gatherings. It is believed that
thousands of frogs were killed to provide “scientific entertainment” to
the public.
Galvani’s conclusion was strongly disputed by an Italian physicist,
Alessandro Volta, who showed that the electric current was generated
by contact of two dissimilar metals (brass and iron) and not by animal
tissues. The controversy between the two scientists led Volta to develop
the first battery called Volta’s pile, consisting of a column of alternating
metal discs—zinc and copper—separated by pieces of paperboard soaked in
saline. The fact is that both Galvani and Volta were correct in their
observations but wrong in their conclusions. It took another 200 years
to resolve the controversy.
The net outcome of the controversy was widespread public interest
in the effects of the newly harnessed electricity on animals and
humans. Giovanni Aldini, a nephew and assistant of Luigi
Galvani, began to demonstrate “reanimation” of the dead by applying
electric current to recently hanged criminals in different towns. He would
apply a strong shock of current to the ear and the rectum. The result was
such a violent muscular contraction of the corpse that often it would
assume a vertical posture, giving an appearance of reanimation. Aldini
even started an electrotherapy clinic, where he claimed to treat various
disorders by giving electric shocks. He reported complete rehabilitation
of patients with mental disorders treated with transcranial electric shocks.
In 1843, a German physiologist, Emil du Bois-Reymond, developed
the most sensitive galvanometer of those days. His device had a wire
coil with 24,000 turns—5 km of wire. With this instrument, he detected
a small potential present in a resting muscle and noted that it diminished
with contraction of the muscle. The change in potential during contraction
was called “action potential” by him.
In 1850, Hermann von Helmholtz, a German physiologist, was
able to measure the propagation speed of an electrical signal in a nerve.
He reported that the nerve signals had a speed of “only” 10-100 meters
per second. The word only was used probably because he was expecting
the electric signals in a nerve to travel with the speed of electricity in an
electric wire.
In 1856, Rudolph von Koelliker and Heinrich Muller, two
physiologists working in Germany, applied a galvanometer to the base
and apex of the frog’s heart and observed an electric current during
each heartbeat. They also applied a nerve-muscle preparation to the
ventricle and observed
a muscle twitch just prior to each systole and also a much smaller twitch
after systole.

Electrocardiography
Willem Einthoven (1860-1927) (Fig.
5.38) was a Dutch physiologist who
invented the first practical electrocardiograph
in 1902 (Fig. 5.39). For this work, he was
awarded the Nobel Prize in medicine, 1924.
By the end of the 19th century, it was known that the beating of the
heart produced electric currents, but these could be recorded only by
placing the electrodes directly on the heart. Starting in 1901, Einthoven
developed a series of more sensitive galvanometers called string
galvanometers.

Fig. 5.38: Willem Einthoven

Fig. 5.39: The first electrocardiograph

This device used a very thin filament of a conductive wire passing
between very strong
electromagnets. When a current passed through the filament, the
electromagnetic field would cause the string to move. A light shining on
the string would cast a shadow on a moving roll of photographic paper,
thus forming a continuous curve showing movement of the string.
The original machine required water cooling for the powerful
electromagnets, needed five people to operate, and weighed 600 lb.
device increased the sensitivity of the galvanometer to such an extent
that cardiac electric potentials could be recorded even from the body
surface, in spite of the resistance of the skin and other tissues of the
body. Einthoven assigned the letters P, Q, R, S, and T to the various
waveforms observed by him. These terms are still in use. After the
invention of the string galvanometer, Einthoven went on to describe the
electrocardiographic features of a number of cardiovascular disorders,
which were very helpful in their diagnosis.

Electroencephalography
Richard Caton (1842-1926) was probably the first to record the
spontaneous electric activity of the brain. In 1874, he published
experiments on the brains of dogs and apes in which unipolar electrodes
were placed on the exposed cerebral cortex. The currents were measured
on a sensitive galvanometer. He found distinct variations in
current, which increased during sleep and ceased after death. Caton was
able to show that strong current variations occurred in the part of cerebral
cortex in a state of functional activity. For example, when light was
shone into the eyes of the animal, greater electrical activity could be
seen in the posterior pole of the brain.
Hans Berger (1873-1941) (Fig. 5.40) was a German psychiatrist who
developed the first human electroencephalograph in 1929. It was a
historic breakthrough, providing a new diagnostic tool for the diagnosis
of neurological disorders, particularly epilepsy and brain tumors.

Fig. 5.40: Hans Berger

The successful introduction of the electrocardiograph by Einthoven
in 1902 inspired further research into the electric activity of the brain.
Using a string galvanometer, Berger made the first EEG record in 1924
during a surgical operation on the brain of a 17-year-old boy. Next,
Berger made 73 EEG recordings from the scalp of his
15-year-old son. When the subject’s eyes were closed, there was a
consistent pattern of 10 cycles per second that was named Berger rhythm
(now known as the alpha rhythm). The frequency of waves was found
to increase suddenly whenever the subject opened his eyes or even did
some mental mathematical calculations. These waves were called the
beta waves.
The first EEG record was taken by Berger in 1924. After 5 years of
extensive work on the EEG records in normal individuals as well as
patients with different neurological disorders, he published the results
in a German journal of psychiatry. The discovery of EEG was a milestone
in neurodiagnostic techniques, but it was completely ignored by the
medical fraternity of the world. Partly, the reason was that it was
published in a journal of psychiatry, which was not taken seriously by
research workers in neurology. Partly, it was due to the style of
Berger’s research. Berger’s experiments were performed in his spare
time in utter secrecy. Even his laboratory, situated in a small building
below his clinic, was not known to anyone. Due to his arrogance and
inflexibility, Berger was shunned by his colleagues. Berger’s work gained
international recognition only when it was replicated by Edgar Adrian,
the famous British neurophysiologist and Nobel laureate, in 1934. In
1937, Hans Berger was invited to preside at the symposium on electric
activity in the nervous system at the Congress of Psychology in Paris.
The delegates hailed Berger as the most distinguished of all the visitors. Tears
came to his eyes as he said: “In Germany I am not so famous.”
Actually, in his own country, Berger was not liked by the Nazis. In
1938, he was forced to retire from his job and research on EEG was
banned throughout Germany. He went into severe mental depression
and committed suicide in 1941.

WERNER FORSSMANN
THE FIRST CARDIAC CATHETERIZATION

Werner Forssmann (1904-1979) (Fig. 5.41), a German physician,
performed the first human cardiac catheterization in 1929. For this
work, he shared the Nobel Prize in medicine, 1956, with two American
physicians.
Experimental cardiac catheterization was performed in animals by
Claude Bernard in 1844. Adolph Fick (1870) also performed animal
cardiac catheterization to find out the rate of blood flow in different
organs and formulated the “Fick principle.”
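For the modern reader, the Fick principle relates blood flow to oxygen
uptake; a representative statement, with illustrative numbers rather
than figures from the original experiments, is:

    Cardiac output = O2 consumption / (arterial O2 content − venous O2 content)

For example, an oxygen consumption of 250 mL/min and an arteriovenous
oxygen difference of 50 mL per liter of blood give a cardiac output of
250/50 = 5 liters per minute.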
In 1929, Forssmann was a medical
intern. He was interested in finding a
method to inject drugs directly into
the heart.

Fig. 5.41: Werner Forssmann

His experiments on human cadavers showed it to be an easy
procedure. A urology catheter could be inserted into a vein in the forearm
and pushed into the venous system till it reached the right atrium of the
heart. He did not dare to ask any human subject to volunteer for such
a supposedly dangerous procedure. Therefore, he requested a fellow
resident friend to push the catheter in his (Forssmann’s) vein. When the
catheter was about one foot into the venous system, the friend panicked
and fled from the scene. One week later, Forssmann himself inserted a
catheter into his own vein. Standing behind an X-ray screen, he pushed
the catheter into the venous system himself. A nurse helped him by
holding a mirror in front of the X-ray screen. The X-ray revealed the
position of the tip of the catheter. When the tip reached the right atrium
of the heart, Forssmann wanted a documentary proof of the experiment.
Therefore, with the catheter tip in the heart, Forssmann walked into the
X-ray room for a radiograph. Instead of taking an X-ray, the technician
rushed out to alert another doctor. The colleague was shocked to see
what Forssmann had done and tried to pull out the catheter. Forssmann
had to give him a few kicks on the shin to prevent him from doing so and
had the X-ray picture taken. Forssmann repeated the procedure on
himself eight times during the next two years. He even injected a
radiopaque dye into the heart through the catheter. With the radiological
evidence, he published the results in a scientific journal. He met with
such a harsh criticism that he left the hospital and opted to practice
medicine in a small village.
Years later, two American physicians, Andre-Frederic Cournand
and Dickinson W Richards, read the research paper published by
Forssmann and tried to use his method to take samples of the blood
from different chambers of the right heart. After experimenting on live
dogs and chimpanzees for 5 years, they felt confident enough to try the
procedure in humans in 1941. The procedure was found to be absolutely
safe. From the catheter tip, samples of blood could be obtained from the
right atrium, right ventricle or the pulmonary artery. Even the pressure
could be measured in any of these chambers. The right heart
catheterization proved highly useful in accurate diagnosis of various
cardiac valvular defects. For this pioneering work, Forssmann, Cournand
and Richards were jointly awarded the Nobel Prize in medicine, 1956.
To Forssmann, still a small-time village doctor, the news came as a
stunning but pleasant surprise.
By 1947, the left heart catheterization was developed by Dr
Zimmerman. The technique later led to the procedures for coronary
angiography and coronary angioplasty.

PIONEERS IN HEART SURGERY


As mentioned elsewhere, wars have been instrumental in making great
strides in the field of surgery. Army surgeons saw such devastating
injuries that, in the face of sure death, any new desperate measure was
considered acceptable. Cardiac surgery owes its birth to World War II.
Of course, by that time routine surgery had become absolutely safe. But
to operate on a beating heart, with its chambers full of blood, was
considered impossible. An American army surgeon, Dr Dwight Harken
was receiving young soldiers from the European front with bullets or
shell fragments lodged in the heart. To leave the shrapnel in was
dangerous, but to take it out surgically was most likely to be fatal. Dr
Harken tried
operating on dogs by giving a deep incision in the wall of the heart and
resuturing it as early as possible. All of the first 12 animals died. Of the
second group of 14 animals, half died. With more experience, the mortality
in experimental animals came down to 2 out of a group of 14. Now Dr
Harken was ready to try the technique on humans. All of the soldiers
operated on by him to remove the shrapnel from the heart survived.
The next logical step was the surgical treatment of mitral stenosis.
In 1948, two American surgeons, Dr Dwight Harken and Dr Charles
Bailey, independently gave a small incision in the wall of the heart and,
by inserting a finger, forcibly dilated the narrowed mitral valve. Initially,
all the patients died. But, within months, the operation became
absolutely safe and the closed heart surgery for mitral stenosis began to
be done all over the world. This procedure required stoppage of
circulation for about four minutes. Any prolonged stoppage of circulation
was likely to produce permanent hypoxic brain damage. Therefore, no
other type of cardiac surgery was possible.
The next step, open heart surgery, became possible by the efforts
of a Canadian surgeon, Dr Bill Bigelow. He noticed that hibernating
animals had a very slow heart rate but survived many months of bitter
winter without food. Therefore, Dr Bigelow started producing varying
degrees of hypothermia and investigating the duration for which the
circulation could be stopped. He learnt that, if the body temperature
was lowered to about 80°F (27°C), the circulation could be stopped for about
10 minutes without any danger to the brain. This duration was considered
safe for the surgical treatment of many valvular defects. In September,
1952, two American surgeons, Dr Walton Lillehei and Dr John Lewis,
performed the first open heart surgery on a 5-year-old girl with a
congenital hole in the heart. The operation was a success. By 1958,
heart-lung machines came into use. These machines temporarily took
over the pumping function of the heart as well as the gas exchange
function of the lungs. Therefore the heart could be operated upon for far
longer duration without having to stop the circulation of blood. With the
advent of hypothermic techniques and heart-lung machines, cardiac
surgery came of age.

CHRISTIAAN BARNARD
Christiaan Barnard (1922–2001) (Fig.
5.42) was a world famous South African
cardiac surgeon. He performed the first
heart transplant on a human in 1967. In
his own words: “On Saturday, I was a
surgeon in South Africa, very little
known. On Monday, I was world
renowned.”
Barnard was the son of a poor
Afrikaner preacher. He walked five
miles each day to study at the Cape
Town University, before becoming a
family physician.

Fig. 5.42: Christiaan Barnard

In 1956, he went to the USA to learn the latest techniques
in cardiothoracic surgery. On his return to South Africa, he was appointed
cardiothoracic surgeon at the Groote Schuur Hospital in Cape Town and
established the hospital’s first heart surgical unit. Barnard performed
the first kidney transplant in South Africa in 1959.

By the year 1967, Barnard was a highly respected cardiothoracic
surgeon of South Africa. For a number of years he was experimenting
with heart transplant surgery in animals. He shot into world fame when
he performed the first human heart transplant on 3 December 1967.
One of his patients, Louis Washkansky, a 55-year-old man terminally ill
with heart disease, agreed to undergo the pioneering operation, whose
success was not sure even to the surgeon. The heart to be transplanted
came from a 25-year-old female victim of an automobile accident. The
operation lasted nine hours and required a team of 30 persons. The
patient survived only 18 days but the historical event made the surgeon
a household name all over the world. Most of the subsequent heart
transplant patients of Barnard did not live long after the operation and
the technique was put on hold for some years. The problem was that the
immunosuppressant drugs used to avoid the graft rejection interfered
with the patient’s ability to fight infections. In 1974, a new drug
called cyclosporine, derived from a fungus found in Norwegian soil, was
discovered; it protected the body against graft rejection without
interfering with the patient’s immunity against
infections. In all, Christiaan Barnard performed 49 transplants during
the period 1967–1983. After this, he retired from surgery because of the
deformities in his hands due to rheumatoid arthritis.
In his personal life, Christiaan Barnard was a handsome and colorful
personality. He spent as much time in the night clubs as in the operation
theater. After the historic event, his time was spent jet setting around
the world meeting famous people, from princes and kings to the American
President and even the Pope. He was frequently found in the company
of Hollywood film stars. Sophia Loren and Gina Lollobrigida were two
of his many such friends. Due to his photogenic face and his association
with film stars, Barnard came to be known as the “film star surgeon”. He
married three times, but each marriage ended in divorce, mostly because
of his philandering. In 2001, he died of an acute asthmatic attack.

DISCOVERY OF ANTICOAGULANTS
Hirudin was the first anticoagulant to be discovered. It was extracted
from leeches and used to prevent coagulation of blood during experimental
and clinical hemodialysis for renal failure. However, it was expensive,
difficult to extract and produced serious allergic and other cardio-
respiratory complications. Therefore it was found unsuitable for human
use.
Heparin, the first clinically useful
anticoagulant, was discovered accidentally
by an American second-year medical
student, Jay McLean. McLean was the
son of a surgeon, who died when the child
was only 4 years of age. He supported
himself as a laborer before joining the Johns
Hopkins Hospital as a medical student in
1915. As a second year student, he was to
undertake a short research project. William
Henry Howell (Fig. 5.43), the professor
of physiology, advised him to try to isolate
a thromboplastin, a procoagulant, from the
liver extract.

Fig. 5.43: Howell

The substance he isolated had
no thromboplastic activity. Instead it showed a marked power to inhibit
coagulation. Howell refused to accept the finding, because it went against
his theory of coagulation. McLean graduated in 1919 and became an
assistant of Halsted, the famous American surgeon working in the same
hospital. Meanwhile, in 1918 and 1920, Howell published research
papers showing the anticoagulant properties of the liver extract and
named it heparin. In both these papers and all other subsequent
publications, the contribution of McLean was never acknowledged.
McLean launched a campaign to claim credit for the discovery of heparin.
But he had no chance against the powerful and renowned researcher; it
was Howell, who received all the acclaim for the discovery of heparin.
Subsequently, it was found that the anticoagulant could be extracted
from many other tissues, especially the lungs and the intestine, but the
term heparin was retained. Charles Best, the Canadian physiologist was
responsible for the purification and standardization of pure heparin in
1930s. Initially it was used only in patients on artificial kidney. Its
importance in the prevention of deep vein thrombosis and coronary
thrombosis was realized only in 1940s.
Dicoumarol, the orally effective
anticoagulant, was discovered by a
Canadian veterinary surgeon, Frank
Schofield (Fig. 5.44). In 1921, he was
investigating the cause of outbreaks of a
hemorrhagic disease in the cattle. The
disease was found in two forms, the
anemic form and the hemorrhagic form.
The anemic form was seen in calves that
had been dehorned. The horn stumps bled
so profusely that the calf died of severe
anemia. The hemorrhagic form was seen
in mature cows that developed huge
hematomas on the body or suffered from
internal hemorrhages.

Fig. 5.44: Frank Schofield

In those days, hemorrhages were a common complication of severe
infections. Schofield
was quick to realize that the affected animals did not show any fever or
toxemia, the common features of severe infection. He went directly to
the farms where the cattle were housed. He found that the animals were
fed sweet clover, a coarse plant which had become mouldy. He spent
time and money on sweet clover research and conclusively proved the
role of the mould in the causation of the hemorrhagic disease in the
cattle. However, no farmer believed him because cattle had often been
fed spoiled feed with no ill effects. It took another 25 years to realize
that the anticoagulant formed in the mould could be useful to humans.
Frequent outbreaks of the hemorrhagic disease in the cattle maintained
the interest of veterinary research workers. In 1929, a veterinary
pathologist showed that the condition was due to a deficiency of prothrombin,
the plasma protein essential for the
coagulation of blood. It was only in 1940
that the identity of the anticoagulant
substance found in the mouldy clover was
established as dicoumarol by two
American chemists, Dr Link (Fig. 5.45)
and his student Campbell. They also
showed that the anticoagulant activity of
dicoumarol was due to its antivitamin
K action. Link was quick to realize the
possible use of dicoumarol in thrombotic
disorders. Since the dosage was not
established, use of the drug initially resulted
in bleeding problems in some of the
patients.

Fig. 5.45: Link

Therefore, in a leading article in a medical journal, dicoumarol was
denigrated as a “drug fit to be used as rat poison rather than in humans.”
Link thanked the writer for the comment and launched the drug as a
powerful and safe rodent killer under the name of Warfarin.
In 1951, a naval enlisted man tried to commit suicide with warfarin
but could be saved by timely administration of the antidote, a large
intravenous dose of vitamin K. With the safety of dicoumarol and
warfarin established, studies began on their use in humans. The benefit
of the drug received wide publicity in 1955 when it was used on Dwight
Eisenhower, president of the USA subsequent to his heart attack.

GERTRUDE ELION
Gertrude Belle Elion (1918-1999) (Fig.
5.46) was an American biochemist and
pharmacologist, who is remembered for the
discovery of a number of life-saving drugs.
She was awarded a Nobel Prize in
medicine, 1988.
Elion’s parents were Jewish immigrants
in the USA. Her father was a dentist
practicing in New York and her mother a
seamstress. Her father had invested all his
money in the stock market. As a result of
the American stock market crash of 1929,
her father went bankrupt.

Fig. 5.46: Gertrude Belle Elion

His dental practice was just sufficient to make both
ends meet, but there was no money for her college education. Luckily
for her, Hunter College offered tuition-free education to girls and she got
admitted. Elion earned her Bachelor’s degree in chemistry at the age of
19. In spite of her meritorious undergraduate career, she could not get
financial aid from any of the 15 colleges to which she applied to pursue the
Master’s course in chemistry. She tried to get the job of a laboratory
assistant in science colleges, but was refused because “they never had a
woman in the laboratory before and her presence was likely to be a
distracting influence on the male students.”
For the next seven years, Elion worked at a number of jobs, mostly
unrelated to her qualification in chemistry: doctor’s receptionist,
high-school teacher of chemistry and physics, teacher of biochemistry to
nursing students, salesgirl in a store, and even checking the acidity of
pickles and the mould on fruit. She got these odd jobs because men were not
available during World War II. She saved some money to pay for her
education and worked for her master’s degree at night and on weekends.
She earned her Master’s degree in 1941, but could not get any suitable
employment. It was only in 1944 that she was offered the job of senior
research chemist by Burroughs Wellcome Pharmaceuticals. She was one
of only two women among the laboratory staff of 75. She was
assigned to work under George Hitchings, a pharmacologist. For the
next 30 years the collaboration of Hitchings and Elion developed a number
of life-saving chemotherapeutic agents like 6-mercaptopurine (for
leukemia), Azathioprine (an immunosuppressive agent for organ
transplantation), Allopurinol (for gout), Trimethoprim (for bacterial
infections) and Acyclovir (for viral herpes). In 1988, both Hitchings and
Elion were awarded the Nobel Prize in Medicine. In 1991, Elion became the
first woman to be inducted into the National Inventors Hall of Fame.
Elion never married. In her own words, she was not against marriage.
In fact, in the early 1940s, she planned to marry a young man, but he died of
a bacterial infection of the heart, only two years before penicillin was
marketed. Subsequently, she did not marry because “in those times,
American women had to choose between a career and a family life; she
chose the former.”

THE THALIDOMIDE DISASTER


Thalidomide was a drug manufactured by a German pharmaceutical
company in the late 1950s and sold as a sedative as well as an antiemetic to
prevent morning sickness in the early months of pregnancy. Besides Germany
and Britain, the drug was available in 46 countries of the world, but not
in the USA. Within 2–3 years of the launch of thalidomide, its teratogenic
effect on fetal development came to light. It is believed that during this
short period, some 15,000 fetuses were damaged by thalidomide, of
which 12,000 were born with the most hideous birth defects. Babies were
born with cleft lips, cleft palate, blindness or phocomelia—the severe
Fig. 5.47: Thalidomide baby Fig. 5.48: Thalidomide adult

deformities of the limbs. These included a trunk lacking either arms,
legs or both; some even had flippers extending from the shoulders or
toes extending directly from the hips (Figs 5.47 and 5.48). Many of the
babies died within one year of birth, but about 8000 of them are still
alive. Lawsuits were filed and the manufacturers of thalidomide were
made to pay heavy monetary compensations. However, if one looks at
a thalidomide adult, no amount of money would seem adequate for the
deformities produced by the drug.
Further investigations revealed that the basic problem was the inadequate
testing procedure employed before the drug was put on the market.
The drug was tested for toxicity only in the rat. When tested later in
pregnant rabbits and mice, the teratogenic effect was obvious. Had
the greedy pharmaceutical company carried out the drug trial in different
species of animals at different stages of the life cycle, the drug would not have
been approved for use in pregnant women.
In the USA, a stricter policy had already been adopted for allowing
the sale of any new drug in the market. Dr Frances Kelsey, of the
Food and Drug Administration, demanded some more
documentary proof of the safety of thalidomide. Before these objections
could be cleared, the teratogenic effects of the drug began to appear in
the medical journals. Therefore permission for the sale of thalidomide in
the USA was never granted. Thus, what the drug companies were calling
a “bureaucratic delay” saved the United States from the thalidomide
disaster. Dr Kelsey became a national hero. She was awarded the
President’s Award for Distinguished Federal Civilian Service (at that
time the highest civilian award in the US) by President John F Kennedy,
in 1962.
Though thalidomide was withdrawn from the market in 1962, it
was later found to be effective in the treatment of some very serious
disorders like multiple myeloma and leprosy. Now the drug is on sale
again, specifically for the treatment of these disorders.

HISTORY OF TOBACCO
SMOKING AND LUNG CANCER
Tobacco smoking was common in many Native American cultures. In
1492, when Christopher Columbus landed in the New World, the
natives presented him with some fruits and some “strongly smelling
dry leaves.” Columbus and his crew ate the fruit but threw away the dry
leaves. Rodrigo de Jerez, a Spanish sailor, is believed to be the first to
have brought tobacco leaves to Europe. In his home town in Spain, the
neighbors were so frightened to see smoke billowing from his mouth and
nostrils that they alerted the police. Jerez was imprisoned for seven
years, but by the time he came out, smoking had become fashionable,
mainly among the sailors.
In the 19th century, cigar became very popular among the rich.
Cigarettes, which were basically made of the sweepings off the floor of
the cigar factory, were smoked by the poor. By the beginning of 20th
century, smoking a cigar was considered one of the “characteristics” of
the high and mighty. Cigarette smoking became widespread when some
American companies started advertising it as a part of a glamorous
lifestyle. Another cause of the increasing popularity was the movies, in
which famous film stars, both male and female, were shown smoking.
The real boost to the sale of cigarettes was given by the success of
tobacco companies in having their product included in the military
rations during World War I. The soldiers, being under stress, took to
smoking and became habitual smokers. By the middle of World War I, the
supply of cigarettes was considered by the generals as important as the
supply of bullets. Almost the entire American army returned home
addicted to cigarette smoking. During World War II, the sale of
cigarettes was at an all-time high. And by that time, the incidence of carcinoma
of the lung had begun to rise.
A search through the medical literature reveals a total of 100 cases up to
the year 1900. A review of 100 years of autopsies in a hospital in
Germany in 1952 showed that the incidence of lung cancer had gone up
from 0.3 to 5.66 percent. There seems to be a time lag of approximately
30 years between the onset of smoking and the development of cancer.
A parallel increase in the prevalence of smoking and lung cancer in 20th
century can be demonstrated (Fig. 5.49). In the middle of the 20th century,
lung cancer in women was practically unknown. By the 1980s, it had become
the number one cause of cancer deaths in women.
In 1952, on behalf of the American Cancer Society, Dr Hammond
started a statistical study of 187,776 volunteers. Forty-four months
later, it was found that the total death rate in smokers from various
causes was twice that in non-smokers. Heavy cigarette smokers had
approximately 9-times the death rate from lung cancer compared to
those who never smoked. The report attracted worldwide attention and
the sale of cigarettes nose-dived. However, the cigarette lobby accused
Dr Hammond of scientific incompetence, statistical jugglery, and a bias
against cigarettes. Next followed the cigarette industry’s most expensive
advertisement campaign. As a result, the sale of cigarettes rose to a new
high.

Fig. 5.49: Smoking and cancer incidence
In response, Dr Hammond launched an even bigger investigation in
1959. As many as 1,057,398 men and women were enlisted and medically
studied for the next 6 years. The study revealed not only a very high
incidence of lung cancer but also of cancers of the mouth, pharynx, and
esophagus among the smokers. In addition, the smokers showed a much
higher incidence of cardiovascular disorders and emphysema. Still, the
cigarette lobby refused to be convinced. They challenged Dr Hammond
to provide experimental evidence of the association between cigarette
smoking and lung cancer.
Dr Oscar Auerbach, a pathologist friend of Dr Hammond, chose to
undertake such a study. He purchased 480,000 cigarettes and 97 dogs
were made to inhale the cigarette smoke through a tracheotomy in their
throats. In the beginning, the dogs struggled against being put into the smoking
chambers. However, within a week or so, the dogs became habituated to
the cigarette smoke. They would jump happily into the smoking
chambers, wagging their tails. After about 30 months, the dogs were put
to death. Postmortem examination revealed a very high incidence of precancerous
changes in the respiratory mucosa of the smoker-dogs. No such changes
were found in the control dogs. The carcinogenic effect of cigarette
smoke was beyond doubt. It took another 30 years for the
antismoking campaigners to convince the governments to ban smoking
in public places, so that at least nonsmokers are not harmed by passive
smoking.

HISTORY OF CANCER
Cancer is one of the most dreaded diseases. The oldest description of
human cancer is found in the Egyptian papyri written between 3000-
1500 BC. It referred to the tumors of the breast and the efforts to treat
them by cauterization (“the fire drill”). The writings say about the
disease, “there is no treatment,” which is almost true even today. Cancer
has also been found in the bones of human mummies in ancient
Egypt.
The origin of the word cancer is credited to the Greek physician,
Hippocrates (460-380 BC). He used this term, along with carcinoma, to describe
malignant tumors. In Greek, the word refers to a crab, an apt name
because the finger-like blood vessels spreading from the tumors look
like the claws of a crab (Fig. 5.50).
In 1761, the Italian pathologist Giovanni Morgagni started
performing autopsies and relating the findings to the patient’s illness before
death. Now cancer could be diagnosed, though only postmortem. The surgical
treatment of cancer started in the 18th century, when the famous Scottish
Surgeon John Hunter (1728-1793) suggested that some cancers can be
surgically cured if the nearby
tissue was not invaded. “If the
tumor is moveable, there is no
impropriety in removing it,” he
wrote. The 19th century saw the
birth of scientific oncology with
the invention of the modern
micro-scope and its use by
Rudolf Virchow, the founder of
cellular pathology. At the same Fig. 5.50: Crab
time Stephen Paget proposed
the “seed and soil theory” for the metastatic spread of cancer. He
theorized that the metastatic tumor cells are like seeds, evenly distributed
throughout the bloodstream, but growing only in the tissues (soil) they
find compatible. In the 1880s, William Halsted, an American surgeon,
devised an extensive operation for the treatment of cancer of the breast,
consisting of removal of the breast, underlying muscles and the axillary
lymph nodes. The operation, called radical mastectomy, achieved an
unprecedented 72 percent five-year survival of the patients. Subsequently,
many surgeons devised operations for early localized cancerous
lesions, but most of the time, the patients reached the doctor too late.
Besides surgical removal of the cancerous growth, there was no
other treatment of cancer till the discovery of radium by the Nobel laureate
husband-and-wife team of Pierre and Marie Curie, in 1898. Soon,
radiation therapy became an important tool in the treatment of cancers.
The first drug to be used against cancer was nitrogen mustard.
Actually it was developed as an agent of chemical warfare and used in
World Wars I and II. Postmortem studies of soldiers exposed to nitrogen
mustard revealed very low white blood cell counts and profound
lymphoid and bone marrow suppression. It was reasoned that since the
agent damaged the rapidly growing white cells, it might have a similar
effect on the cancer cells. Therefore, in the 1940s, many patients with
advanced lymphomas were treated with nitrogen mustards. The agent
was effective, though only temporarily, but it started the search for other more
effective chemotherapeutic agents against cancer.
Shortly after World War II, another approach to drug therapy in
cancer began. Folic acid, a vitamin discovered in 1937, was found to be
essential for the synthesis of DNA. Since cancer cells are rapidly dividing
cells, they might require folic acid for their proliferation. With this idea
in mind, folic acid was administered to some children suffering from
acute lymphatic leukemia. The result was a sudden acceleration in the
proliferation of the leukemic cells. Thus began a search for folic acid
analogues which could block the function of folate-requiring enzymes.
Aminopterin and later methotrexate were found to be efficient
antifolates. Methotrexate was found to produce remission in childhood
leukemia without the toxic side-effects seen with nitrogen mustards.
Anticancer chemotherapy and radiotherapy are now firmly
established modes of treatment of cancers. The latest tool under trial
against cancer is the “smart-bomb immunotherapy.”
An anticancer vaccine, the first of its kind, is now available as a preventive
measure against cancer of the cervix. It was approved for sale in June 2006. Next
to cancer of the breast, carcinoma of the cervix is the most common malignancy in
women. About half a million women are diagnosed with this disorder
every year. It has been proved that cancer of the cervix is caused by genital
infection with human papilloma virus during sexual contact. By
preventing the infection with the virus, the vaccine is likely to eliminate
cervical cancer within a generation.

HANS SELYE
Hans Selye (1907-1982) (Fig. 5.51) was a Hungarian-born Canadian
scientist who coined the term “stress” and put forth the idea of the
General Adaptation Syndrome.
In 1934, Selye was a research assistant
in the biochemistry department of McGill
University, Montreal, Canada. He was
trying to discover a new sex hormone.
Injection of ovarian extracts into rats was
found to produce, in each case, enlargement
of the adrenal cortex, atrophy of the lymph nodes
and deep ulcers in the stomach. His
jubilation was short-lived because soon he
found that similar results could be obtained
not only by injection of any other tissue
extract but even by injection of the toxic
tissue preservative, formalin.

Fig. 5.51: Selye

With the idea of a new hormone evaporated, Selye
decided to see the results from a new angle. He came to the conclusion
that any type of injurious stimulus produced a common pattern of body
reactions. He recalled his student days in Prague when he observed a set
of symptoms and signs in all the patients, whatever the underlying
cause. All the patients looked and felt ill, had a coated tongue, complained
more or less of pains in the joints, intestinal disturbances, loss of appetite
and fever. Specific and diagnostic signs appeared only later. At that time,
Selye tried to interest his teachers and fellow students in the new
syndrome of “just being sick,” but no one bothered. Now, he felt that all
the experimental rats were suffering from “just being sick,” or a
nonspecific response to any assault on the body.
Hans Selye continued his research and came out with the
discovery of the hypothalamic-pituitary-adrenal axis in the body’s response
to stress. Three stages of the body’s response were identified by him, namely,
(i) the alarm reaction, in which the body prepares for the fight-or-flight
response, (ii) the stage of adaptation and (iii) the stage of exhaustion. The
triad of responses was named by Selye as General Adaptation Syndrome
(GAS).
Since the existence of stress could now be easily detected by the
estimation of the pituitary hormone (ACTH) and/or adrenal hormones
(adrenal corticoids) in the blood, a large number of stressors were soon
found to affect the human body. According to Selye: “The beggar who
suffers from hunger and the glutton who overeats, the little shopkeeper
with his constant fear of bankruptcy, and the rich merchant struggling
for yet another million, are all under stress. The mother who tries to
keep a child out of trouble and the child who scalds himself with hot
water, both are under stress.” This quotation, taken from the introduction
of Hans Selye’s book “The Stress of Life,” sounds more like part of the
lecture of a saint than of a scientist, since Selye was a mixture of
both. He became a messiah of the feel-good messengers of health. He often
spoke of the value of love in human life, and of the importance to
our own well-being of helping others. Selye published 33 books and
over 1600 research articles, almost all of them on the subject of stress.
Dr Selye served as Professor and Director of the Institute of Experimental
Medicine and Surgery at the University of Montreal from 1945 until
retirement in 1970.

SOME FAMOUS NEUROPHYSIOLOGISTS


Sir Charles Sherrington (1857-1952) (Fig.
5.52) was a famous British neurophysiologist,
who made extraordinary contributions to the
understanding of the function of the nervous
system. He shared the Nobel Prize in Medicine,
1932, with another neurophysiologist, Edgar
Adrian.
Sherrington’s research, spanning more
than 50 years, laid the foundation for modern
neurophysiology. Sherrington’s biggest contribution
was the study of various postural reflexes.

Fig. 5.52: Sir Charles Sherrington
He discovered that any external stimulus, without the cooperation of
will, calls forth a definite response, such as the contraction of a certain muscle.
Even a voluntary movement is possible only against a background of
involuntary contraction of a large number of postural muscles. He also
discovered the phenomenon of reciprocal innervation—contraction of a
group of muscles of a limb is accompanied by reflex inhibition of the
antagonist muscles. Working on decerebrate cats, dogs and monkeys, he
found that reflexes must be considered integrated activities of the total
organism, not just the result of simple reflex arc responses at the level of
a segment of the spinal cord.
He developed the theory that transmission at the synapse occurs in
one direction only, and originated the term “synapse.” Sherrington made
extensive research on sensory physiology. He coined the terms
“exteroceptors,” “proprioceptors” and “visceroceptors.”
Besides his scientific work, Sherrington was a man of wide interests:
biographer, medical historian, poet, book collector and sportsman. As a
young man, Sherrington was a notable athlete who excelled in rowing and
football.
Edgar Douglas Adrian (1889-1977) (Fig.
5.53) was a famous British neurophysiologist
who was awarded the Nobel Prize in
Medicine, 1932, (shared with Sherrington)
for his path-breaking research in
neurophysiology.
For the first time, Adrian, using a cathode
ray oscilloscope and a microelectrode, was
able to amplify the electric activity in a nerve
5,000 times and record the impulse discharge
in a single fiber.

Fig. 5.53: Edgar Douglas Adrian

He describes the first experience as follows: “I had arranged
electrodes on the optic nerve of a toad in
connection with some experiments on the retina. The room was nearly
dark and I was puzzled to hear repeated noises in the loud speaker
attached to the CRO; noise indicating that a great deal of impulse activity
was going on in it. It was not until I compared the noises with my own
movements around the room that I realized that I was in the field of
vision of the toad’s eye and that it was signaling my movements.”
In subsequent studies, Adrian demonstrated the phenomenon of
adaptation of sensory receptors—when a stimulus is constantly applied,
the impulse discharge from the receptor gradually declines. He also
demonstrated that most of the impulses originating from the pain afferents
terminated in the thalamus.
Adrian demonstrated the sensory homunculus, showing that the part
of the sensory cortex devoted to any part of the body depends on the
biological importance of that part to the animal. For example, he
found that in humans and monkeys, the part of sensory cortex devoted
to the face and hand is very large but very little is devoted to the trunk
of the body. In the pig, practically the whole sensory area is devoted
to the snout, the part of the body a pig uses to explore its environment.
Sir John Carew Eccles (1903-1997)
(Fig. 5.54) was an Australian neurophysiologist
who was awarded the Nobel Prize in
Medicine, 1963, for his fundamental
contribution to the ionic mechanisms of the
synaptic transmission in the brain and his
outstanding contributions to the integrative
function of the neurons in the spinal cord,
hippocampus and cerebellar cortex.
After his graduation in medicine from
the University of Melbourne in 1925,
Eccles was offered a scholarship to work in
neurophysiology in London under Charles
Sherrington.

Fig. 5.54: Eccles

Following Sherrington’s retirement in 1937, Eccles returned to
Australia as a director of an
Institute of Pathology and continued neurophysiologic research
independently. Eccles pioneered the use of intracellular microelectrodes
to record the electric activity in the spinal motor neurons. Eccles used
the simple stretch reflex arc as a model. On stimulation of a single
sensory nerve fiber in the dorsal nerve root, Eccles was able to record a
small excitatory postsynaptic potential (EPSP) in the spinal motor
neuron. Stimulation of some other sensory neuron produced a small
inhibitory postsynaptic potential (IPSP). An action potential was fired
in the spinal motor neuron only by the summation of a number of EPSPs.
These observations led Eccles to abandon his stoutly held belief that the
synaptic transmission was electric in nature. The observations on EPSPs
and IPSPs convinced him about the chemical nature of synaptic
transmission. Subsequently, he discovered acetylcholine as the
neurotransmitter in the spinal cord synapses. Following his retirement
in 1968, Eccles moved to the United States as Distinguished Professor
of Physiology and Biophysics at the State University of New York.
Here he made important contributions to the physiology of the various
types of cerebellar neurons.

WILLEM J KOLFF
Willem J Kolff (born 1911) (Fig. 5.55) is a Dutch-born American
physician who is considered the father of dialysis. He invented the first
“artificial kidney” or the dialysis machine in 1943. He also invented
many other life-saving devices, including the intra-aortic balloon pump
in 1967 and an artificial (mechanical) heart in 1981.
Kolff was born in the Netherlands and received his MD
degree in 1938 from the University of Leiden. Soon after graduation, Kolff
started work in the University’s department of medicine under a
sympathetic professor of medicine, Polak
Daniels. At that time, he had a painful and
frustrating experience when a young man
suffering from kidney failure was admitted
to his care. Kolff watched helplessly as the patient
died, since no treatment of kidney
failure was available at that time. Kolff
wished he had some means of removing
the poisons which accumulate in the body
in the absence of normal kidney function.
In 1940, Kolff founded a blood bank in his
hospital, which was the first on the
European continent.

Fig. 5.55: Kolff

By that time his
country was invaded and occupied by the German army. Being Jews, Polak
Daniels and his wife committed suicide rather than suffer at the hands of
Nazi forces. Kolff moved to a municipal hospital in the small town of
Kampen. The first “artificial kidney” was developed in that small
hospital.
The concept of an artificial kidney was originally conceived by an
American pharmacologist, John Abel. In 1913, he added aspirin to
dog’s blood and funneled it through a series of tubes of porous material
placed in a rinsing fluid chemically resembling blood plasma. He was
able to draw aspirin out of the blood, but the coagulation of blood was
a big problem. He used leech heads to prevent clotting, which was not
practical for human use.
By 1938, heparin was available as an effective and safe anticoagulant.
Moreover, cellophane had been invented. This material was basically
used for wrapping food articles like sausages. Other researchers had
shown that cellophane was a semipermeable membrane which allowed
diffusion of small molecules from an area of high concentration to an
area of low concentration. Kolff decided to harness this property of
cellophane to draw poisonous chemicals out of the
blood of patients with kidney
failure. Since the Netherlands was
occupied by the Nazis, life was
tough for all the Dutch. Still, Kolff
continued working on the invention of an
artificial kidney (Fig. 5.56). In
1942, he persuaded the managing
director of a local enamel factory
to construct an enamel tub of a
size in which a horizontal drum could be rotated. Kolff wound 100 feet
of cellophane tube in a spiral fashion around a large drum which revolved
in the enamel tub containing a rinsing fluid resembling human plasma.

Fig. 5.56: First artificial kidney
Most physicians had little faith in Kolff’s invention. As a result,
they would send only terminally ill, comatose patients with kidney failure
to Kolff for dialysis. He could not save any of the first 15 patients with
kidney failure.
In 1945, the first patient to be saved by the artificial kidney was a
67-year-old woman. Kolff was pressured by his colleagues not to
treat this patient since she was a traitor and Nazi collaborator. However,
thinking that a patient’s life must be saved, whoever he or she may
be, Kolff put her on dialysis. She was in a coma when admitted to the hospital.
After many hours of dialysis, Kolff bent over her and asked if she could
hear him. She slowly opened her eyes and said, “I am going to divorce
my husband!” She recovered fully, divorced her husband in due course
and lived another seven years.
After the end of World War II, Kolff sent artificial kidney machines
to England, Canada, and the United States. Soon the artificial kidney
came to be used worldwide but it could be used only in cases with acute
renal failure, in which the kidneys shut down for a few days and then
gradually start functioning normally. The use of the artificial kidney involves
the insertion of tubes into an artery and a vein at the wrist. The arterial
blood is passed through the dialysis machine and returned to the
patient’s vein. With repetition, the blood vessels became clogged, and a
new set of blood vessels had to be cannulated. Within a few days, no
vessel was available for cannulation, and the artificial kidney could no longer
be used. Therefore, patients with chronic renal failure, who need life-
long dialysis, could not be saved. This problem was solved by Belding
Scribner, an American Professor of medicine, in 1960. Scribner came
out with a U-shaped shunt, now known as the Scribner shunt, which could
be permanently inserted into an adjacent artery and a vein of the patient.
Thus, it was not necessary to make a new incision each time the patient
underwent dialysis. The device was made of Teflon, so that the blood
did not clot in the shunt. With the use of the shunt, patients with
end-stage kidney failure could be kept alive for years. The prognosis for
such patients changed from 90% fatal to 90% survival. Soon the medical
fraternity was faced with another dilemma. The patients with chronic
renal failure who could benefit from dialysis outnumbered the
available dialysis machines. In the United States, a bioethical committee
was constituted to decide who should receive dialysis; in other words,
who should live and who should die! To solve the problem, Scribner
soon came out with a small portable dialysis machine for use in the
home of the patient. This device helped the patients to leave the hospital
and lead a near-normal ambulatory life. The development of kidney
transplant technology has not decreased the importance of
hemodialysis. Patients with chronic end-stage renal failure have to be
kept alive by regular hemodialysis till a suitable donor can be found for
renal transplantation.
HISTORY OF ORGAN TRANSPLANTATION


Organ transplantation was imagined centuries before it became really
feasible. The Chinese physician Pien Chiao is reported to have exchanged
hearts between a man of strong spirit but weak will and a man of
weak spirit but strong will, in an attempt to achieve a balance in each
man. According to Roman Catholic mythology, the third-century saints
Damian and Cosmas replaced the gangrenous leg of a Roman priest with
the leg of a recently deceased Ethiopian slave. More likely accounts
refer to skin transplantation. The Indian surgeon Sushruta described
in his book the technique of autograft skin transplantation in nose
reconstruction (rhinoplasty). The first successful corneal transplant
was performed by an Austrian surgeon Eduard Zirm in 1905.
Bombing of cities during World Wars I and II caused a
marked increase in burn victims in whom the burnt surface was so
large that a skin autograft was not feasible. The application of a skin
graft from another human, the homograft, was followed by a high rate of
graft rejection. Therefore the War Wounds Committee of the British
Medical Research Council assigned a young Oxford-educated zoologist, Peter
Medawar, to investigate the problem of graft rejection and how to
circumvent it. Medawar noticed the heavy collection of lymphocytes in
the area of graft rejection in the host. Moreover, he found that the
second graft from the same donor was rejected more quickly than the
first. In 1951, Medawar reported that the graft rejection was an immune
phenomenon and that lymphocytes play an important role in it. He suggested
that graft rejection could possibly be prevented by the use of an
immunosuppressant drug. Cortisone had recently been discovered. It
was the first immunosuppressant drug to be tried. The discovery of
HLA antigens by Jean Dausset in 1958 gave a further boost to attempts
at organ transplantation. Kidney transplants were found to be highly
successful from an HLA-identical sibling donor or from another
HLA-matched individual. By the 1970s the most effective
immunosuppressant drug, cyclosporine, had been discovered. Within two decades,
it was possible to transplant kidney, heart, liver, lung, etc.
The donor for a transplant surgery is usually a close relative or a
family friend with a strong emotional attachment to the patient.
Sometimes it is purely an altruistic gesture. Over half of the members of the
Jesus Christians, an Australian religious group, have donated kidneys to
total strangers. Deceased donors are those donors who have been declared
brain-dead and whose organs are kept viable by mechanical resuscitators,
until they can be excised for transplantation. Young adult victims of
fatal automobile accidents are found most suitable. In China, a notable
category is that of executed prisoners. In India, often kidneys are
“donated” for commercial reasons—both legal and illegal. Poverty has
led many males to donate a kidney in exchange for a fairly large sum of
money. Others have been duped into an unnecessary operation during
which a kidney is removed without the knowledge of the victim. One
reason for these unethical activities is the great discrepancy between the
supply of and the demand for organs. Secondly, the price difference in
various countries for the organs and the transplant surgery has resulted
in “transplantation tourism.” In different countries, the cost of a kidney
may vary from 1000 to 20,000 dollars.
Face transplant is the latest success story in this field. In December
2005, a French surgeon, Dr Bernard Devauchelle, performed a partial
face transplant on a woman whose face was mutilated by a ferocious dog.
Her nose, lips and chin had been bitten off and she could not speak or eat
properly. Due to her disfigurement, she was afraid of going outdoors.
During the surgery, the damaged areas, like the nose, lips and chin, were
replaced with tissues from a recently deceased female donor. By December
2007, her face had recovered to the extent that she had regained
normal skin sensations and control of her facial muscles. In a crowd, no one
noticed any difference from a normal face.

JOSEPH E MURRAY
Joseph E Murray (born 1919) (Fig. 5.57) is
an American plastic surgeon who performed
the first renal transplantation in 1954. He was
awarded the Nobel Prize in Medicine, 1990,
for this work.
After receiving his medical degree, he was
commissioned in the US Army Medical Corps
in 1944. He served as a plastic surgeon under
James Barrett Brown and Bradford Cannon.
During the next two years, he performed over
1,800 plastic surgeries on American soldiers
coming back from the battlefields of World War II with extensive burn
injuries.
Fig. 5.57: Murray
The typical procedure involved grafting
skin from a healthy area of the body to the burned areas. Such a graft, an
autograft, was always successful. Problems arose when patients had
such extensive burns that there was not enough healthy skin to be
grafted. In such patients, skin from another healthy donor was used as
a graft. However, such foreign grafts, allografts, from other individuals
were always rejected. Murray’s mentor Barrett Brown had studied the
problem of graft rejection in the 1930s and had discovered that the only
successful allografts were those between identical twins. That
experience was to become the stepping stone to the first renal transplant.
After his discharge from the US Army, Murray was interested in
specializing in plastic surgery. This discipline of surgery was very young
in the late 1940s, and therefore Murray was advised to specialize in
general surgery. He joined a group of surgeons looking for a method of
renal transplantation in cases of end-stage chronic renal failure. Many
years of experimental research in dogs led Murray to a technique for
transplanting the donor kidney into the lower abdomen, connecting the renal
artery and vein to the respective internal iliac vessels and connecting the
ureter to the bladder. However, whether the same method would succeed
in humans was not known to Murray. The opportunity to test his technique
came in 1954 when a young man was admitted who was suffering from
chronic renal failure. He had an identical twin brother, who was willing
to donate his kidney to save his life. In spite of the offer, Murray and his
team were confronted with the moral problem of removing an organ
from a healthy person. They sought clearance from the clergy as well as
the Massachusetts Supreme Court before undertaking the procedure.
Extensive testing was carried out, including a successful skin graft from
the healthy brother to the patient as well as the fingerprinting of the
brothers at the local police station. The latter test came to the notice
of the press, who sought daily progress reports on the contemplated operation.
On December 23, 1954, the surgery began in two adjacent operating
rooms. In one, the recipient was prepared for the transplant, while
the donor’s kidney was removed in the other. The final vascular
anastomoses of the transplanted kidney took about an hour and a half.
The surgeons were not sure how much anoxic damage the kidney might
have suffered during this period of bloodlessness. As the clamps were
removed, there was a hush in the room followed by grins as the donor
kidney turned pink and the urine began to flow briskly. Thus the very
first kidney transplantation was a success. The patient recovered fully
and married the nurse who looked after him in the recovery room after
the operation. They had two children, but the patient died in 1962 from
a recurrence of the original disease in the transplanted kidney. The
donor brother, with only one kidney, is still alive and leading a normal
healthy life.
Murray and his team continued to perform renal transplants but at
a slow rate. The difficulty was finding a suitable donor. The obvious
answer was a method to suppress the phenomenon of graft rejection.
The advances in pharmaceutical research by Elion and Hitchings
in the 1950s provided the necessary immunosuppressive drugs. Over the
years, kidney transplantation has become the most common organ
transplant, with a very high success rate.

WILDER PENFIELD
Wilder Penfield (1891-1976) (Fig. 5.58) was
a Canadian neurosurgeon known for the
discovery of the temporal lobes (hippocampus)
as the seat of memory. During his life, he was
called “the greatest living Canadian.”
Penfield was trained in neurology in
Oxford and in Spain, Germany, and New York,
before becoming the first neurosurgeon in
Montreal (Canada). He was chiefly responsi-
ble for the establishment of the Montreal
Neurological Institute in 1934, where
surgeons, physiologists, and research scientists in neurology worked as
a team to extend knowledge of neurological diseases.
Fig. 5.58: Penfield
In the 1950s, Penfield was developing a surgical treatment for intractable
epilepsy. Before an epileptic seizure, the patient usually feels an ‘aura’,
a group of sensations, a warning that the seizure is about to occur.
Penfield thought that he could localize the focus of epilepsy if electrical
stimulation of an area of cerebral cortex could produce an aura. Therefore,
in a conscious patient, he would open the skull under local anesthesia
and electrically stimulate various parts of cerebral cortex. His technique
was often successful; excision of that part of the brain stopped epileptic
seizures. During these studies, Penfield made some more significant
discoveries in neurophysiology.

Penfield found that electric stimulation of the precentral gyrus of the
brain resulted in motor activity on the contralateral side of the body.
Further, the motor area devoted to some parts of the body, like the hand,
face and tongue, was far greater than that devoted to, say, the trunk and
limbs. The map so created was called the motor homunculus (Fig. 5.59).
Fig. 5.59: Motor homunculus
Still more important was the discovery that stimulation of the temporal
lobe produced memories of long-forgotten events, like sounds, movement,
color or even smell. If Penfield stimulated the same area in the temporal
lobe again, exactly the same memory cropped up. Thus it came to be
known for the first time that the temporal lobes are the seat of memory.

ERIC RICHARD KANDEL


Eric Richard Kandel (born 1929) (Fig. 5.60) is an American psychiatrist
and a neuroscientist. He is the recipient of the Nobel Prize in Medicine,
2000, for the discovery of the physiological basis of memory storage in
neurons. He shared the prize with Arvid Carlsson and Paul Greengard.
The world of neuroscience was first opened up to Kandel through
his interaction with a college friend whose parents were Freudian
psychoanalysts. Freud’s concept of conscious and unconscious memories
fascinated him so much that, just after graduation in medicine, he started
research work on memory. By that
time, Wilder Penfield had shown that
the memory was stored in the
hippocampus (temporal lobes of the
brain), but the cellular basis of memory
storage was still not known. Kandel began to realize that the complex
cellular architecture of the hippocampus was not a suitable model for
studying the cellular basis of memory.
Fig. 5.60: Kandel
Instead, he chose an invertebrate marine mollusk, Aplysia californica.
His supervisor
and all the colleagues were shocked at his choice of the animal model.
“What can you possibly learn in an invertebrate about such a complex
phenomenon as memory?” they asked. Undeterred by the criticism,
Kandel continued his work. In the beginning, he demonstrated the simple
forms of memory, such as habituation, sensitization, classical conditioning
and operant conditioning. Subsequently he was able to identify the
proteins that had to be synthesized in order to convert short-term
memories into long-lasting memories.

NITRIC OXIDE: FROM MENACE TO MARVEL OF THE DECADE
Nitric oxide (NO) is a very unstable free radical that was known to be
a common air pollutant formed when nitrogen burns, such as in
automobile exhaust fumes. Now it is known to be present in all the
tissues of the human body, including the nervous system. The discovery
of its existence in the human body and of its physiological functions was
the result of more than two decades of research by three American scientists,
Robert Furchgott (Fig. 5.61), Louis Ignarro and Ferid Murad, along with
many others. All three were awarded the Nobel Prize in Medicine in 1998.
The story begins in the 1980s, when
Furchgott was experimenting on the effects
of acetylcholine on the arterial smooth muscle.
In one of the experiments, Furchgott failed to
observe the expected vasodilatation. As
compared to the other preparations, he found
no difference in the smooth muscle, but this
preparation of the artery had lost the
innermost lining, called the endothelium. “Was the endothelium
essential for the vasodilator effect of acetylcholine?” Furchgott thought.
Fig. 5.61: Furchgott
A new series of experiments proved that some
unknown agent released from the endothelium was responsible for
vasodilator effects of not only acetylcholine but also some other well-
known vasodilators. Furchgott named the mysterious agent
endothelium-derived relaxing factor (EDRF). Curiosity provoked several
laboratories to try to identify the chemical nature of EDRF. In 1986,
Ignarro presented a paper in a conference showing that the EDRF was
chemically nitric oxide (NO). Unknown to him, Furchgott reached the
same conclusion independently and presented his conclusion in the
same conference. These reports were followed by a flood of studies on
NO. Almost every week, the presence of NO at a new site was reported.
Many physiological and pathophysiological roles of NO were discovered.
Most surprisingly, NO was found to be present as a neurotransmitter in
various parts of the brain. In the meantime, another scientist, Ferid
Murad was able to demonstrate the biochemical pathway responsible
for the generation of NO in the endothelial cells.
While studying the relaxant effect of NO on the vascular and
nonvascular smooth muscle from the penile erectile tissue, Ignarro realized
that the naturally occurring physiological neurotransmitter involved in
the erectile response was still not known. It was known that the neurons
supplying the penile erectile tissue were neither adrenergic nor
cholinergic. Could the neurotransmitter in the erectile tissue be NO?
Electric stimulation of strips of rabbit penile tissue was found to produce
marked muscle relaxation, which could be blocked by the addition of a nitric
oxide synthase inhibitor to the organ bath fluid. Later he could directly
demonstrate the release of NO in the intact penis of a rabbit, by electric
stimulation of its nerve fibers.
During the last decade, more than 3000 papers on NO research have
been published per year. A journal devoted entirely to the research on
NO, named Nitric Oxide, has started publication. Researchers were
surprised that a gas, NO, could act as a neurotransmitter in the CNS. In
fact it has been shown that, of all the body tissues, the brain contains the
enzyme nitric oxide synthase in the highest concentration. One of the
mechanisms of bacterial killing by a macrophage also involves the release
of the toxic molecule NO.

TEST TUBE BABIES (IN VITRO FERTILIZATION)

In vitro fertilization (IVF) is the technique in which ova obtained from
a woman are fertilized with the sperm of the husband (or another
donor, if the husband has a very low sperm count) in a laboratory. The
fertilized ovum is implanted into the uterus of the mother and pregnancy
is allowed to continue as usual. The first child to be born in this manner
was called a “test tube baby” by the lay press.
The first test tube baby, named, Louise Brown, was born in July
1978. The technique was developed over a period of nine years by the
British gynecologist Patrick Steptoe (Fig. 5.62) and his partner Robert
Edwards, a physiologist. Her birth marked a medical achievement that
changed the lives of hundreds of thousands
of childless couples. By the year 2000,
more than 300,000 women around the
world had conceived by IVF. These days,
in some countries around 4% of the babies
are born through IVF. It has been estimated
that in the year 2004, 1.5 million children
were born worldwide with IVF technology.
Nowadays, one can buy sperm from a sperm bank, buy human ova from
a donor, and have a surrogate female for the gestation (“rent a womb”)
of the fertilized ovum.
Fig. 5.62: Steptoe
Thus, one can become the mother of a baby without undergoing all the
emotional and physical hassles of nine months of pregnancy! The Indian
rent-a-womb service has become so popular among foreigners
that it is touted as an important component of medical tourism.
Within two months of the birth of the first test tube baby in England,
an Indian doctor, Subhash Mukhopadhyay, announced the birth of a
test tube baby, “Durga,” in Calcutta. The news was found too far-fetched
to be believed by the medical fraternity. An enquiry was instituted by
the West Bengal Government. It is said that the enquiry committee
could not find the sophisticated infrastructure necessary for the IVF
technique. The West Bengal Government not only transferred Dr
Mukhopadhyay out of Calcutta, but also did not allow him to go to
Tokyo, where he was invited to present his paper on the IVF technique
used by him. Facing social ostracism, ridicule and reprimand,
Mukhopadhyay committed suicide in 1981. Only in 2005 did the Indian
Council of Medical Research and international scientists recognize
the claim of Dr Mukhopadhyay and acknowledge India as the country
where the world’s second test tube baby was born.

BAREFOOT DOCTORS OF CHINA


Barefoot doctor was the name given to the members of a peasant
medical force assigned to eradicate infectious diseases in China in 1958.
In those days, infectious diseases like smallpox, polio, diphtheria,
whooping cough and schistosomiasis were highly prevalent in rural
China. The Chinese Government had neither funds nor medical manpower
to launch a massive medical program. Faced with this practical difficulty,
the government gave an elementary medical training course of three to
six months to thousands of peasants. These health workers, mostly men
and women in their 20s, already had some basic education. They were
given a working knowledge of anatomy, bacteriology, the diagnosis of
simple ailments, the prescription of traditional and Western medicines,
birth control, and the organization of sanitation campaigns. These health
workers
continued with their traditional occupation,
farming. More often they were barefooted,
hence the name (Fig. 5.63). Within 10 years,
China had one million barefoot doctors. These
barefoot doctors gave the poor farmers basic
lessons in personal hygiene, and carried out a
large scale immunization program. As a result
of their efforts, the incidence of many infectious diseases, especially
schistosomiasis, was claimed to have been reduced dramatically.
Fig. 5.63: Barefoot doctor and patient
In the 1970s, the WHO was so impressed with the barefoot doctors’ work
that it recommended it as a model for other
underdeveloped countries. However, by 1990s the barefoot doctors
program was abandoned in China. Later, it came to be known that the
success story of the barefoot doctors was largely propaganda by the
communist regime. The World Health Organization recently ranked China
as fourth-worst out of 190 countries for quality of health care.

V RAMALINGASWAMI
Vulimiri Ramalingaswami (1921-
2001) (Fig. 5.64) was one of the most
eminent medical scientists in India. He
was trained in medicine but very early in
his career, he switched over to pathology
and became a research officer in the
Nutritional Research Laboratories,
Coonoor (Tamil Nadu), the forerunner
of the National Institute of Nutrition,
Hyderabad, set up in 1947. From this
laboratory in Coonoor, he was initially
sent to Laboratories of Armed Forces
(forerunner of AFMC, Poona) to work
under Col. Leo Krainer, an Austrian
neuropathologist.
Fig. 5.64: Ramalingaswami
Subsequently, he was trained in nutritional pathology at Oxford,
where he was awarded an M.Phil.
in 1951 for his excellent experimental work on the role of essential fatty
acids in the causation of phrynoderma, a skin disorder. Back home,
Ramalingaswami continued to work in the modest post of Research Officer.
During this period he made very important discoveries on protein-
calorie malnutrition, a highly prevalent nutritional deficiency disorder in
India of those times. He also made extensive studies on Himalayan
goiter as well as vitamin A and D deficiency disorders. When AIIMS
was founded, Ramalingaswami was appointed Professor and Head of
Pathology. He remained in this post for 10 years, from 1969 to 1979.
During this period, he invited many world-famous pathologists to his
department and helped in the development of close association of AIIMS
with other top-class research centers of the world. In 1978, he was
appointed Director General of the ICMR, New Delhi, a post he occupied
till retirement.
Ramalingaswami received numerous national and international awards,
including the Padma Bhushan, one of the highest civilian awards of the
Government of India. It is to the credit of Dr Ramalingaswami and his
colleagues at National Institute of Nutrition, Hyderabad that these
scientists addressed the problems prevalent in India and made significant
contributions to the improvement in health care of the local population.
This cannot be said of most other high-ranking Indian postgraduate
and research institutes, which were happy duplicating the research “in
fashion” in the Western world at that time.

AS PAINTAL
Autar Singh Paintal (1925-2004) (Fig. 5.65)
was one of India’s greatest scientists of the 20th
century. He is remembered for his numerous
discoveries in respiratory physiology. He was
not only a towering figure in the field of
physiology, but also a colorful and uncompro-
mising personality in Indian science. During
his lifetime, Paintal was honored with a large
number of awards, including the Padma
Vibhushan in 1986.
Paintal studied medicine in King George’s
Medical College, Lucknow. His brilliance was evident even during his
undergraduate career.
Fig. 5.65: Paintal
He was awarded a large number of medals, including that for the best
graduate of his class. When reminded
of the medals once, he replied: “I wasted a lot of time trying to stand
first, while I should have been reading magazines like Scientific American
which contain solid science.” After an MD in Physiology from the same
institute, Paintal proceeded to the UK on a fellowship. There, he was
introduced to respiratory physiology when he joined Professor
Whitteridge, a famous respiratory physiologist in Edinburgh, for a PhD
program. At the outset, he impressed his mentor by developing a
technique to study single nerve fiber conduction velocities. He was the
first to demonstrate that the atria of the heart contained type-B receptors
that are involved in the regulation of blood volume. This contradicted
the views of his mentor, who believed that the volume receptors were
located in the great veins or pulmonary veins. At that time, Paintal also
demonstrated the stretch receptors in the stomach responsible for the
feeling of satiation after food intake. After a brief and uneventful stay in
the All India Institute of Medical Sciences, New Delhi, Paintal was
practically forced out in 1955. He became the Director of the Patel Chest
Institute, New Delhi, a post he held till 1990, when he was appointed
Director General of the Indian Council of Medical Research (ICMR). At
the Patel Chest Institute, Paintal worked in a two-roomed laboratory and
produced world-class research work. Paintal is best remembered for the
discovery of the juxtacapillary pulmonary receptors, which he named
J-receptors. He demonstrated the role of J-receptors in various physio-
logical and pathological states. He made so many contributions to the
visceral physiology that some eminent British physiologists coined the
term “Pre-Paintal versus Post-Paintal era” in sensory physiology.
Paintal was a staunch believer that research should be conducted
and presented in the most ethical manner and not used as show business.
He refused to attend the inauguration of a medical conference held in a hotel.
He founded The Society of Scientific Values, the first of its kind in the
world, to promote integrity, objectivity and ethical values in scientific
research. He also believed in defending one’s research findings (“jung
ladna seekho”: learn to fight battles).
As the Director General of the ICMR, Paintal courted worldwide
controversy when, in 1990, he urged the Government of India to ban
Indians from having sex with foreigners. “This is a totally foreign disease,
and the only way to stop its spread is to stop sexual contact between
Indians and foreigners.” The suggestion was met with outrage, with
critics asserting that such a law would violate human dignity. Only 15
years later, India had the dubious distinction of having the largest number
of HIV-positive patients.

INSTRUMENTAL ERROR LEADS TO AN IMPORTANT DISCOVERY
Dr BK Anand (born 1917) is a pioneer in neurophysiology in India.
After obtaining a postgraduate degree in physiology, he proceeded to
the United States in 1950 to work in the laboratory of a famous American
neurophysiologist, Professor John Fulton under a Rockefeller Foundation
Scholarship. At that time, Anand did not touch alcohol or tobacco. To
get along well with Fulton, Anand was advised to start drinking since the
mentor used to discuss the research projects with his students only over
a drink. Eager to learn neurophysiologic techniques, Anand lost no time
in taking up the advice.
Dr Anand was assigned to learn the stereotaxic technique from Prof
John Brobeck, by repeating a procedure already established in the
department to destroy the satiety center in the hypothalamus of rats.
After the procedure, the rats were expected to become gluttonous and
obese. To Anand’s dismay, all the six rats operated on by him stopped
eating and died of starvation. Dr Brobeck told Anand that he had not
learnt the technique properly. On the other hand, Anand, confident of
the precision of the method he had followed, proposed that there might be
two centers in the hypothalamus, a satiety center, destruction of which
produced obesity, and a feeding center, destruction of which led to
inhibition of food intake. When the brains of the experimental animals
were examined under a microscope, in each rat, a giant lesion was found
in the hypothalamus. Further investigation revealed the basic cause:
The lesion maker was giving a current of 20 milliamperes, when it was
set to deliver only 2 milliamperes. Further studies led to the localization
of two centers in the hypothalamus: a discrete lesion of one led to obesity,
whereas starvation resulted from the destruction of the other. This
discovery made Anand a famous neurophysiologist before the end of his
fellowship program. Dr Anand was offered a faculty position in Dr
Fulton’s laboratory, but patriotic fervor compelled him to come back to
India in 1951. He became head of the department of Physiology in the
All India Institute of Medical Sciences, New Delhi, and along with a team
of colleagues, continued to carry out notable research in neurophysiology.

ALEXIS CARREL
Alexis Carrel (1873-1944) (Fig. 5.66) was
a French surgeon, biologist and eugenicist,
who was awarded the Nobel Prize in
Medicine, 1912, for his work on suturing
of blood vessels as well as his work on
transplantation of organs in animals.
In 1894, when Carrel was still a medical
student, the president of France bled to
death after being wounded by an assassin.
If doctors had known the art of repairing
ruptured blood vessels, the life of the
president could have been saved. This idea
captured the imagination of Carrel to such an extent that he started
experimenting while still a student.
Fig. 5.66: Carrel
Carrel graduated in 1900, and one year later published
a procedure for the repair of blood vessels in a French medical journal.
Somehow, Carrel was not liked by the French establishment. In spite of
his aptitude for research, Carrel was not given a faculty position in any
university in France. In 1906, he got an opening in the USA in the newly
opened Rockefeller Institute for Medical Research, an institute entirely
devoted to research. Carrel remained in that institute till 1939. Here he
continued to improve his methods for blood vessel surgery. In 1910, he
published a method to transfuse blood directly from one person to
another (anticoagulants were yet to be discovered). Carrel also developed
the art of tissue culture. In 1912, Carrel took out the heart of a chicken
embryo and kept it alive for the next 34 years. The tissue was deliberately
terminated after his death.
Alexis Carrel is now more infamous for his strong support of the
eugenics movement of the 1930s. He advocated the use of gas chambers to
rid humanity of “inferior stock,” meaning criminals, the insane, cheats, or
just someone not of a high family background. He was a big supporter of
the Nazis’ program of ethnic cleansing for the purpose of eugenics.

THE EUGENIC MOVEMENT


Eugenics was a social philosophy which advocated the improvement
of human hereditary traits through various forms of intervention, ranging
from forced sterilization to the killing of individuals perceived as being of
inferior stock.
Selection of human stock was suggested as far back as Plato, who
believed that human reproduction should be controlled by the
government. “The best men must have intercourse with the best women
as frequently as possible, and the opposite is true of the very inferior,”
wrote Plato in his book The Republic. In the ancient city of Sparta, all
newborn infants were left outside the city gates for a certain length of
time. The survivors were considered strong enough and deserving to
live, while the weaklings were eliminated.
During the 1870s, Sir Francis Galton, a British biologist, reinterpreted
Darwin’s theory of the origin of species. According to Galton, “genius”
and “talent” are hereditary human traits and selective human reproduction
could be used to improve the human stock. He coined the word eugenics,
derived from the Greek for good birth, or the science of heredity
and good breeding. Eugenics eventually referred to a movement for the
selective human reproduction with intent to create children with desirable
traits, generally through the approach of influencing the differential
birth rates. These policies were mostly divided into two categories:
Positive eugenics, the increased reproduction of those seen to have
advantageous hereditary traits and negative eugenics, the discouragement
of reproduction by those with hereditary traits perceived as poor.
By 1920s, the USA and Germany were in the forefront of eugenics
movements. Nazi Germany under Adolf Hitler was the biggest
supporter of eugenics, and attempted to maintain a “pure” German race,
through a series of programs which ran under the banner of “racial
hygiene.” The Nazis performed extensive experiments on young human
adults to test their genetic theories. During the 1930s and 1940s, the Nazis
forcibly sterilized hundreds of thousands of people whom they viewed
as mentally or physically “unfit.” Between 1934 and 1945, as many as
400,000 men and women were sterilized. Nazis went further and killed
thousands of institutionalized disabled persons through compulsory
“euthanasia,” euphemism for murder. Millions of other ‘undesirable’
people like Jews, gypsies, and homosexuals were eliminated in the gas
chambers. Germans also implemented a number of “positive eugenics
policies.” Under this program racially pure and beautiful single women
were to be impregnated only by SS officers (members of the elite Nazi
paramilitary corps).
The second most popular eugenics movement was in the USA.
Starting in 1896, many states enacted marriage laws with eugenic
criteria. Anyone who was “epileptic, imbecile or feeble-minded” was
not allowed to marry. Charles B. Davenport, a prominent American
biologist, after years of “research” in the first decade of the 20th century,
came to the conclusion that all those who came from poor economic
or social backgrounds were unfit for further reproduction. In 1924, the
Immigration Act put a ban on the immigration of people from Eastern
and Southern Europe and Asia, to control the number of “unfits”
entering the country. Under the eugenics program, between 1903 and
1963, over 64,000 individuals were forcibly sterilized in the USA.
Actually, most of the Western nations adopted the eugenics program
with gusto. Canada carried out thousands of forced sterilizations, and
the program lasted as late as 1970. Native Canadians and immigrants
from Eastern Europe were especially targeted. Sweden forcibly sterilized
62,000 “unfits,” consisting of those mentally ill or those belonging to
ethnic or racial minority groups. Other countries, like the UK, Australia,
Norway, France, Finland, Denmark and Switzerland did not lag behind
in their efforts to improve the racial (Caucasian) stock.
The ghost of eugenics resurfaced in 1980, when Robert Graham, an
American millionaire, started the Genius Sperm Bank. The aim was to
collect sperm from the most intelligent men, like Nobel Laureates, to
impregnate American women so that the next generation of Americans
would be extraordinarily brilliant. He managed to collect semen from only three
Nobel Prize winners. Then he started visiting various universities to
find the brightest young adult males for the collection of genius semen.
The semen was stored in a sperm bank and given to any willing couple
at a nominal cost. The aim was not to make money, but to improve the
American race. Unfortunately, the supply of genius semen could not
keep pace with the demand. Using the semen from the Genius Sperm
Bank, 230 children were conceived. Since the identities of these children
were not known, the true extent of the success remains a mystery.
However, one such child, Doron Blake, became famous because of the
countless interviews given by his mother. According to her, the
child started using a computer at the age of two (!). He was reading
Shakespeare and learnt algebra in the nursery (!). At the age of six, the
child had a certified I.Q. of 180 (!). A few others who have revealed their
20th Century Medicine | 315

identities have been found to be equally talented. But the real picture is
still hazy. First, the child inherits the same number of genes from the
mother as well. Would she have to be superintelligent in order to achieve
the best results? Secondly, the role of environmental factors cannot be
eliminated. In any case, Robert Graham died in 1997, and within two
years of his death, the Genius Sperm Bank folded up. Even those who
genuinely believed in eugenics were not ready to say so publicly, for fear
of human rights organizations.

HISTORY OF ALCOHOL
Alcohol has been used since ancient times in almost all civilizations,
for medicinal, social, religious, or recreational purposes.
Earlier, only fermented wines were in use, but with the discovery of the
process of distillation in the 15th century, alcohol consumption rose rapidly
throughout the world. Moreover, stronger varieties of alcohol, like whisky
and vodka, were made available. It soon became a most profitable business
proposition. Alcohol intake rapidly spread throughout the Western
culture.
Hindu mythological descriptions from around 2000 BC present
accounts of the consumption of soma or somaras by higher classes of
the society and the gods, for its tranquilizing and euphoric effects.
However, by and large, consumption of alcohol as well as nonvegetarian
food was looked down upon. Muslim rulers appeared in India around
700 AD. With the religious edicts of Islam, the use of alcohol was
officially banned, but the ruling classes themselves were heavy drinkers.
In 1920, the sale of alcohol was banned throughout the USA. But there
was no let-up in the drinking habits of the Americans. Those who
wanted could get it easily, either from bootleggers or by a medical
prescription. Such persons included President Harding of the USA,
who had voted for prohibition as a Senator. During his time, the White House
was always well-stocked with bootleg liquor. With the alcohol production
in the hands of criminals and home manufacturers, the quality of alcohol
deteriorated. Adulteration with industrial alcohol and other toxic chemicals
led to blindness as well as brain damage and deaths. Since whisky could
be obtained on a medical prescription, the sale of alcohol for "strictly
medicinal purposes" reached an all-time high. Millions of gallons of alcohol
were dispensed on medical prescriptions every year. The only benefit
of prohibition was that the gangsters and corrupt enforcement agencies
became enormously rich, whereas the government not only lost a lot of
revenue but also spent millions of dollars trying to enforce prohibition.
The widespread corruption created a law and order problem. Ultimately,
prohibition was repealed throughout the United States in 1933.
British rule brought to India a distinct increase in the
consumption of alcohol. Its use was encouraged by the distilleries owned
by the British. More important, the social groups of Indians that wanted
to be close to the masters considered drinking as an essential part of
social interaction. When India became free, prohibition was one of the
directive principles of the Indian Constitution. Total prohibition has been
in force in Gujarat since 1960, and for brief periods in Haryana, Andhra
Pradesh and Nagaland. In all these states, the effects of prohibition have
not been different from those seen during the prohibition era in the USA
(1920–1933).
At present the consumption of alcoholic drinks is on the rise all
over India. According to a WHO survey in 2004, there has been a 300
per cent increase in the consumption of hard drinks during the last 40 years.
Even women are taking to alcohol. According to one estimate, the
consumption of alcohol among women has increased fourfold
during the last 10 years.

TRUTH SERUM
Truth serum is a double misnomer. It is neither a serum nor does its
administration ensure truthful statements from the subjects. A truth
serum is actually a drug used on a prisoner in the hope of obtaining
accurate information, most often by police or military organizations.
The problem of working out who is telling the truth and who is
lying, especially among enemy spies and criminals, is as old as
civilization. Alcohol is the most ancient truth serum and still in use
today. A Roman proverb states: “In wine there is truth.”
In the medieval period, the truth was tested by ordeals of fire or
water. Someone suspected of lying would have to carry a red-hot iron
bar for nine steps. If he was truthful, God was expected to save him
from injury. As might be expected, each suspect got his hands
burnt and was then promptly hanged. In some courts, the accused
person was put into a sac and thrown into a pond. An innocent person
was expected to float in water.
Calabar beans were used as an ancient truth drug in Africa,
particularly Nigeria. Calabar beans were administered to persons accused
of witchcraft or other crimes. If the person survived, it was taken as a
sign of innocence. Actually, calabar beans are tasteless but highly
poisonous. The active ingredient is physostigmine. When the beans
were given to the suspect, an innocent person would not be afraid and
would swallow the beans all at once. The result was severe gastric irritation
and vomiting of the seeds. A guilty person would swallow the beans
half-heartedly and slowly. Vomiting did not ensue and the person would die
of physostigmine poisoning.
Early in the 20th century, obstetricians began to use scopolamine
along with chloroform to induce a state of “twilight sleep” during
childbirth. It was also noticed that women in twilight sleep answered
questions accurately and often volunteered exceedingly candid remarks.

In 1922, an American obstetrician, Robert House, felt that the drug
might possibly be useful in the interrogation of suspected criminals. His
interrogation of three criminals and subsequent results of the police
investigations seemed to validate House’s contention. He concluded
that under the effect of scopolamine, a subject “cannot lie—and there is
no power to think or reason.” His experiment and the conclusion attracted
worldwide attention. The phrase “truth serum” was used for the first
time in a news report on the experiment in 1922, and the phrase stuck
forever. In the next ten years, House published about eleven articles on
the subject, claiming a grossly exaggerated success rate. He came to be
known as “father of truth serum.” Actually, it seems scopolamine was
not often used by the police for extracting truth from an unwilling
witness because of its serious side effects. Scopolamine produced
hallucinations, disturbed perception, somnolence and physiological
phenomena such as tachycardia, blurred vision and headache. The subject
was too disturbed to give any coherent answers to the questions asked.
Within a few years, the use of scopolamine as truth serum was abandoned.
By 1930, barbiturates, a class of hypnotics and sedatives, were
found therapeutically useful for the production of sedation and even
anesthesia. Some of them with ultrashort action have been used as a
truth serum and the method has been called narcoanalysis. When
administered intravenously, they produce all the stages resembling
progressive drunkenness, ultimately ending in unconsciousness. As the
effect of the drug wears off, the subject goes into a state of semi-
wakefulness and disorientation. The inhibitory centers of the brain are
especially depressed. In this state of mind, the subject is likely to come
out with many facts which he would be unlikely to reveal in normal
consciousness. This method of narcoanalysis continues to be used to
this day.
Since 1942, the American intelligence agencies have been on the lookout
for a more reliable truth serum, a chemical that could break down the
psychological defenses of enemy spies and prisoners of war. The aim
was to make it easier to obtain information without physical torture,
otherwise resorted to. In the early 1950s, the American intelligence agency,
the CIA, started trials of a most mind-bending drug, LSD.
drug produced such bizarre hallucinations that the subject preferred to
tell what he was expected to admit rather than undergo another dose of
medication. Not lagging behind, MI6, the British intelligence agency,
also started investigating the use of LSD as a truth serum. In the search
for a truth serum, the American and British volunteer subjects were not
told the truth. They were informed that they were participating in a
medical project to find a cure for the common cold. Many of the volunteers
became insane and some committed suicide. In 2006, MI6 was forced
by the courts to pay financial compensation to the former test subjects.
In any case, by 1960, both the CIA and MI6 came to the conclusion that
LSD was useless as a truth serum.
A spin-off of the research on LSD was its use as a recreational drug.
It started with the use of LSD by the research workers themselves and on
their friends. It began to be advocated as a means of spiritual growth
and enlightenment. Ultimately, the production of and research on LSD
were banned in the USA and many other countries in 1967. But its illegal
production and sale have continued to this day. Its use was one of the
reasons for the spread of the hippie culture in the 1960s and 1970s.

RICHARD AXEL
Richard Axel (born 1946) is an American scientist who shared the
Nobel Prize in Medicine, 2004, with his colleague, Linda B Buck, for
their pioneering work in the physiology of olfaction (Fig. 5.67).
Axel was born to a poor family that had emigrated from Poland to
the USA. His father was a tailor and his mother was uneducated.

Fig. 5.67: Axel and Buck

Even from childhood, he had to do all sorts of odd part-time jobs to support
the family. At the age of eleven, he was a delivery boy for a dentist; at
twelve, he was laying carpets; and at thirteen, he was serving corned beef
at a canteen. He continued to earn money from such jobs till he earned
his MD. His entry into medical school was not because of any love for
the medical profession; it was the only way he could ensure deferment
from military service.
According to Axel, his undergraduate years in the Johns Hopkins
School of Medicine were "terrible." He was pained by constant exposure
to the suffering of the ill. He could rarely hear a heart murmur, never saw
a retina, his glasses fell into an abdominal incision, and finally, he sewed a
surgeon's finger to a patient while suturing an incision. His incompetence
and disinterest in medicine became clear to his teachers, who urged the
dean to find a solution. He was given an MD degree with the undertaking
that he would not practice medicine on live patients. Thus, he became an
intern in Pathology where he was expected to perform autopsies on
cadavers. After a year in the department, the Chairman of Pathology
asked him never to practice even on dead patients. Thus ended his career
in medicine.

Axel's next appointment was as a researcher in the Department of
Genetics. He liked the job so much that, contrary to the regulations, he
often left the laboratory only at midnight and was often issued parking
violation summonses. In the middle of an important experiment, Axel was
arrested by the FBI for over 100 such unattended summonses. Even his
work on recombinant DNA was not without controversy. Because the
research work involved genetic engineering, he and his colleagues were
accused of playing "God." When he named his first son "Adam," it was
taken as further evidence of his desire to act as God. Only after many
years of endless debates did the public understand the medical benefits
of genetic engineering.
Only in the late 1980s did Axel become interested in the problem of
sensory perception in the brain. Together with a colleague, Linda Buck,
Axel discovered 1,000 odorant receptor genes in the rat genome, which
provided an answer to the diversity of odor recognition, and published
the results in 1991. The work was awarded the Nobel Prize in Medicine
in 2004.

WHEN PATIENTS REFUSED TO
TERMINATE A FAILED DRUG TRIAL
Pfizer, a famous pharmaceutical company, had developed a vasodilator
drug for use in patients with angina pectoris. During the drug trial in
human subjects, it was found to be useless, since the patients did not
find any improvement in their symptoms. Therefore, it was decided to
terminate the drug trial. However, the men undergoing the drug trial
wanted to continue with the drug. On enquiry by the perplexed scientists,
it was revealed that the drug gave rise to strong and long-lasting penile
erections, and at their age this was a real bonus. This strange side effect
was not reported during the drug trial because the researchers neither
expected nor asked about penile erections as a side effect of the drug. In any
case, the effect of the drug, sildenafil, was confirmed in further drug
trials on patients with erectile dysfunction (impotence). The drug was
marketed in 1998 as Viagra. The name is said to be derived from a
Sanskrit word, "vyaghra", meaning "tiger".
Normally, when a new drug is launched, the drug manufacturers
spend millions on publicity through presentations at major medical
conferences, seminars, symposia, and advertisements in medical journals.
Viagra came to be known all over the world without any of these efforts.
The print and electronic media spread the “great news” on their own.
Then, reports of deaths in people using Viagra started appearing. The
deaths were not a direct effect of any drug toxicity but due to the "stress"
of the long-forgotten "exercise." Among some wives, it was considered
a great nuisance, because the husbands demanded sex after decades of
sexual abstinence. Moreover, there were complaints of Viagra-using
husbands running wild after younger women. Psychiatrists advised the
men to warn their wives before starting the use of Viagra, so as to prevent
the breakup of long-lasting marriages.

DISCOVERY OF HELICOBACTER PYLORI
Dr Barry Marshall (born 1951) and Dr Robin Warren (born 1937)
(Fig. 5.68) are Australian physicians who were awarded the Nobel Prize
in Medicine, 2005, for their discovery of Helicobacter pylori, the
bacterium responsible for gastric disorders like peptic ulcer and gastritis.
For ages, chronic gastritis and peptic ulcer have made people
suffer a lifetime of pain, distress and inconvenience. The condition was
considered intractable because it was believed to be caused by inherent
personality disorders like anxiety or aggressive behavior. The ultimate
surgical treatment, like partial gastrectomy along with vagotomy,
somewhat relieved the symptoms, but the patient's life became more
miserable because of the effects of the operation. Alternatively, the
patients were advised to "learn to live with it."

Fig. 5.68: Warren and Marshall
In 1979, Dr Robin Warren, a pathologist in Australia, discovered the
presence of small curved bacteria on a biopsy of gastric mucosa. During
the next two years, he confirmed the presence of this particular type of
bacteria in almost all the gastric mucosal biopsies from patients with
gastritis or peptic ulcer. His report was greeted with skepticism, since it
was argued that "no bacteria could survive the highly acidic environment
of the gastric mucosa." To Dr Warren, it looked like a dead end for the
research project. Luckily, in 1981, Dr Barry Marshall, a registrar in the
gastroenterology department of the same hospital, approached Dr Warren
for a research project. The research project of Dr Warren got a big boost
since Dr Marshall could provide the clinical material for further research.
Dr Marshall was routinely doing endoscopy on all cases of upper
gastrointestinal disorders. Hence a gastric mucosal biopsy could easily
be taken in each patient. First of all, they reconfirmed the association of
the curved bacteria and gastritis. Next, they cultured the bacterium and
named it Helicobacter pylori.
Numerous publications by the two workers failed to convince the
medical fraternity. To give a convincing reply, Dr Marshall swallowed a
culture of Helicobacter pylori. A week later, he developed acute gastritis
and the gastric biopsy confirmed the presence of Helicobacter pylori.
Following a course of antibiotics, both the gastritis as well as the bacteria
disappeared. From 1985 to 1987, all the patients with gastritis in Dr
Marshall's care were treated with antibiotics alone. In each case, there
was a permanent cure. Still, it took more than a decade for the acceptance
of Helicobacter pylori as the causative factor in gastritis and peptic
ulcer. The ultimate reward to Marshall and Warren for their years of
research came in the form of the Nobel Prize in Medicine, 2005.

THE POPULATION EXPLOSION:
AN IMPACT OF BETTER HEALTH CARE
Since the 1950s, there has been a dramatic increase in world population.
It took the entire history of mankind to bring the population to 1 billion
in 1810. In another 120 years, it doubled to 2 billion (1930). By the year
1975, it was 4 billion, and now it is over 6 billion. It has been projected
that the world population will increase by 1 billion every 11 years.
The main cause of population explosion is not any increase in birth
rates. Birth rates are actually declining in most of the countries, especially
in the affluent developed countries with very high literacy rates. It is the
severe fall in death rates that has caused the rapid increase in the world
population (Fig. 5.69).
For nearly 2000 years, the world population remained stable because
of the impact of famines, epidemics, and wars. For example, in the 14th
century, the great plague wiped out one-third of the world population.
In the absence of any knowledge of sanitation or any effective treatment
of infections, about one-third of infants died before reaching the age of
five. Even adults had a life expectancy of not more than 30-40 years.

Fig. 5.69: Population explosion
By the year 1950, antibiotics as well as vaccines had been discovered.
Moreover, treatment had been found for almost all ailments except
cancer (and now AIDS). Another major factor responsible for the
population explosion was the dramatic increase in food production
during the last 50 years, with fewer deaths due to starvation. To overcome
the problem caused by population explosion, birth control has been
adopted as a national policy in most of the countries of the world.
Contraceptive measures and abortions are as old as the history of
mankind. However, till recently, they were used to avoid unwanted
pregnancies, not for population control. Males wanted to enjoy sex but
did not want too many children, so as to avoid the division of property
into small holdings. Nor did they want a brood of illegitimate children
for fear of social stigma. Females resorted to these measures to avoid
illegitimate pregnancies. All these reasons are valid even today, but the
dread of the "population bomb" wrecking the economies of the developing
countries is the main reason behind official backing of birth control
programs.

DR WILLIAM H MASTERS LEARNS
SEXOLOGY FROM PROSTITUTES
When William H Masters (Fig. 5.70),
an American physician, applied for funds
to do research in sexual dysfunction, the
Vice-chancellor of the university initially
advised him not to commit professional
suicide, but ultimately gave in. Thus, in
1954, he set up a laboratory with a bed,
ECG and EEG machines, a color movie
camera, a flood light and a bench for
biochemical investigations. He wanted
to record the responses in various parts
of the body, including the genitalia, during
sexual intercourse.

Fig. 5.70: William H Masters and Virginia Johnson

Thinking that "normal" people would not cooperate
in such an investigation, Masters initially started using prostitutes,
both male and female. "I learnt a lot of sexology from them," Dr
Masters confessed. Among the prostitutes was a girl with a PhD in
sociology, who had become a part-time sex worker to supplement her
income! She tried
to convey certain sexual responses, which Masters found difficult to
grasp. Therefore, she advised him to have a female research collaborator
who could interpret the responses of the female subjects. The university
placement bureau found him a woman, Mrs Virginia Johnson, who
had no university degree but had the “necessary experience” by having
divorced three husbands. The two research workers continued the work,
but the study on prostitutes did not progress well, since their sexual
responses were distorted by the demands of their profession.
Consequently, they reluctantly advertised for "normal respectable
volunteers." The response was overwhelming and totally unexpected.
As many as 700 men and women, black and white, young and old,
applied. The eldest among them was an 89-year-old man. The team of
Dr Masters and Mrs Johnson made remarkable observations on the
sexual function and dysfunction. In 1959, they started a clinic for the
treatment of sexual dysfunction. The collaboration in research went on
to extend into their personal lives. Masters divorced his wife and married
Mrs Johnson. Their sex clinic became so popular that their income
exceeded 1,000,000 dollars per year. However, in 1993, Masters and
Johnson were divorced, and their 30-year collaboration in sex research
and therapy came to an end. Masters, then 78 years old, remarried a
woman he had met 55 years earlier while he was a medical student.

ALTERNATIVE MEDICINE
Alternative medicine describes practices used in place of the conventional
(Western or allopathic) system of medical treatment. The term
complementary medicine is used to describe medical practices
used as an adjunct to conventional medical practices.
According to a recent survey, about half of the general population in
the developed world is using complementary or alternative medicine
(CAM). The use of CAM seems to be increasing all over the world. In
the USA, a large number of medical schools are offering MD degree in
CAM. In the UK, no medical school offers courses that teach the clinical
practice of alternative medicine. However, an introduction to alternative
medicine is part of the medical curriculum in many medical schools. In
India, the practice of
Ayurvedic medicine is widespread and has patronage of the Central and
State Governments. In China and many other Southeast Asian
countries, Traditional Chinese Medicine, including acupuncture, is
practiced. Besides these, other systems of alternative medicine include
homeopathy, hydrotherapy, meditation, naturopathy, etc.
What is the reason for the popularity of alternative medicine? To a
large population in the developing countries, this is the only health care
system available; it is not an alternative to any other system. However,
the popularity of alternative medicine in highly educated developed
countries is an enigma to most of the doctors trained in Western health
care system. To them, the alternative medical therapies have no rational
or scientific basis; their efficacy has never been proved by double-blind
trials. The advocates of alternative medicine argue that “the conventional
allopathic medicine is intimately tied to multibillion dollar
medicopharmaceutical-industrial complex, whose first priority is to make
money. That is why these doctors refuse to admit the beneficial effects
of alternative medicine.” To a patient, the first and the last priority is to
get well. Any system, conventional (allopathic) or alternative, that
relieves his misery is acceptable. Many practitioners of allopathic
medicine do not mind the use of alternative medicine as an adjunct to the
conventional medicine.
Ayurveda is an ancient Indian (Hindu) system of medicine that has
survived till date. Ayurveda deals with the practices for healthy living,
along with therapeutic measures that relate to physical, mental, social,
and spiritual harmony. After Indian independence in 1947, there has
been a resurgence of interest in Ayurvedic medicine. Various Indian
State and Central Government agencies are funding Ayurvedic medical
institutions. Ayurvedic practitioners have been appointed as Honorary
Ayurvedic Physicians to the President of India. A few of the medical
colleges offering the MBBS degree have incorporated the Ayurvedic system
as a part of the curriculum.
Faith in Ayurvedic medicine in the Indian population can be judged
from the popularity of the Ayurvedic hospitals. Even in the West, many
pharmaceutical firms are interested in investigating the possible
therapeutic value of the herbs used in ancient Indian and Chinese systems
of medicine. In 1990, an American company started a 90 million US
dollar project for this purpose. Ten years later, when nothing came out,
the project folded up. This failure to find any useful herbal drug cannot,
however, obscure the fact that in 1950 an Indian cardiologist, Dr Rustom
Jal Vakil, had published in the British Heart Journal the clinically proven
antihypertensive action of an Ayurvedic drug, serpina, derived from the
plant Rauwolfia serpentina. The active principle of the herb, marketed
as reserpine, became the first safe drug against hypertension.
Turmeric, the spice used in Indian food, is claimed to have many
medicinal properties. At present, there is worldwide research on the
therapeutic value of turmeric. Hundreds
of research publications support its value
in a variety of degenerative conditions and
even a role against cancer. Curcumin, an
active ingredient of turmeric, has been
shown to act as an antioxidant. In addition,
it has been shown to have antibacterial
and antiviral activity. Curcumin seems to
inhibit virtually all stages of cancer
development in rodents.
Acupuncture is a component of
Traditional Chinese Medicine (Fig. 5.71).

Fig. 5.71: Acupuncture

Acupuncture flourished in China till
1932, when Chiang Kai-shek took over
power in China. He brought Western medicine to China, and acupuncture
was banned, at least in the cities. When Mao Tse Tung took over China
in 1949, the traditional Chinese system of medicine was revived and the
Western system was banned. Today, China has more than 232,000
traditional Chinese medical doctors; some 30,000 are added every year.
In 1972, President Nixon of the USA visited China. During the visit,
a New York Times journalist developed appendicitis. He was operated
upon with acupuncture used as the anesthetic. The journalists were also
shown brain surgery performed using acupuncture rather
than anesthesia. These reports made headlines all over the world,
especially in the USA. Since that time, acupuncture has become an
important tool of alternative medicine in the West. In the USA, there are
about 8,000 acupuncturists. However, the role of acupuncture as a
therapeutic tool is controversial. Acupuncture is claimed to be useful in
a large number of medical disorders. According to researchers at the
National Institutes of Health (USA), acupuncture seems to reduce pain
sensation through the release of endorphins in the brain. This is the only
scientifically plausible mechanism of acupuncture. The Annual Review of
Medicine, 2000, reported that acupuncture is effective in the treatment
of postoperative and chemotherapy-induced nausea and vomiting, and
for headache and low back pain. For the rest of the ailments, there is no
evidence that acupuncture is effective.
Homeopathy is a system of medicine
which was discovered and developed
by a German physician, Dr Samuel
Hahnemann (1755-1843) (Fig. 5.72).

Fig. 5.72: Hahnemann

In the 18th century, medical science
was hardly better than it had been two thousand
years ago. Blood-letting, leeching and
purging were the chief therapeutic tools for most of the ailments. The
German physician Samuel Hahnemann was so disgusted with the
outcome of the treatment of his patients that he left medical practice and
started translating Cullen’s materia medica from English to German.
When he was translating the treatment of malaria, his attention was
arrested by the author’s remark that cinchona bark cured malaria because
of its bitterness and tonic effects on the stomach. Hahnemann asserted
that the efficacy of cinchona bark must be due to some other factor,
since there were other substances decidedly more bitter than cinchona
but not effective for malaria. As an experiment, he took repeated doses
of cinchona bark. Within a few days, Hahnemann is claimed to have
developed fever, chills, and other symptoms of malaria! Hahnemann
came to the conclusion that cinchona was beneficial in malaria because
it produced symptoms of malaria. Hahnemann spent the next six years
experimenting on himself, his family and a small but growing group of
followers. He came to the conclusion that a medicine cures a disease
only because it produces similar symptoms in healthy individuals. In
1796, he published his "Law of Similars" or "like cures like" in a respected
medical journal of Germany. The popularity of Hahnemann spread
because at that time there was no effective treatment for most
ailments. Hahnemann's medicines were as effective or ineffective as, but
certainly less dangerous than, those administered by the mainstream
physicians. In those days, medicines were dispensed by a group of
“qualified” people called the apothecaries, not by the physicians. Since
Hahnemann’s system of treatment, called homeopathy by him, seriously
affected their income, the disgruntled physicians and apothecaries got
him arrested and expelled from the town in 1820. Hahnemann moved to
another town and obtained special permission to prepare and dispense
his medicines.
Despite stiff opposition from mainstream medicine, called
allopathy (the term itself was coined by Hahnemann to describe the
traditional Western medicine), homeopathy flourished in the 19th and
early part of 20th century. By the year 1900, there were 22 homeopathic
medical schools and 1000 homeopathic medical pharmacies in the USA.
Homeopathy was equally popular in the UK and other European
countries, especially among the aristocracy. Homeopathy was
disproportionately popular among the women of that era. The first ever
homeopathic medical school in the world was the Boston Female Medical
College, founded in 1848.
By the 1930s, the popularity of homeopathy began to decline all over
the world, chiefly because the allopathic system of medicine began to
offer specific medicines like antibiotics, along with advances in surgical
treatment. A study published in the August 2005 issue of the Lancet
contends that homeopathic remedies are no better than placebo. However,
even today, homeopathy has a sizeable following as an alternative system
of medicine.
Magnetotherapy was initially
developed by Franz Anton Mesmer
(1734-1815) (Fig. 5.73), a curious
figure in the world of psychic practitioners.

Fig. 5.73: Mesmer
Mesmer started his career as a physician
in Austria. While bleeding a patient, a form
of treatment common in those days,
Mesmer noticed that the flow of blood
increased as he approached the
patient and lessened noticeably when he
moved away. This single observation was
enough to convince the 40-year-old
handsome physician that his body must
exert some sort of magnetic force. A wealthy English lady passing through
Vienna developed severe stomach cramps, which soon disappeared when
Mesmer applied a strong magnet to her belly. Next, a hypochondriac baron,
complaining of spasms in his back was referred to him. Within six days
of the magnetic treatment by Mesmer, the muscle spasms disappeared.
Fame was heaped upon Mesmer by the public for his magical
magnetotherapy, to the extreme annoyance of his mainstream physician
colleagues.
Long before Freud, Mesmer seemed to have realized that the effects
of sexual repression in young unmarried women present as hysteria and
a variety of other symptoms. Mesmer began to treat such women with
his magnetotherapy. He claimed to possess an “animal magnetism,”
through which he could cure patients with his intimate touch. As his popularity
grew, so did the outrageousness of his methods. For this treatment,
residence in his hospital was essential. His sensual healing technique
became the talk of the society. French King Louis XVI offered him a
lifelong handsome salary if he would sign a contract to remain in Paris
and furnish proofs of his method of treatment. Mesmer declined to offer
any proof and threatened to leave France for good. The King established
a Royal commission in 1784 to evaluate the claim of Mesmer on animal
magnetism treatment. The commission came to the conclusion that the
animal magnetism practiced by Mesmer was merely a method of
suggestion. It was an art of increasing imagination by degrees. Widely
ridiculed, Mesmer was forced to leave Paris. He moved on to Vienna,
where he again set up a clinic for animal magnetism therapy. Later he
was exiled from Vienna also.
Nowadays Mesmer is remembered for the science of hypnotism,
called mesmerism, though he never practiced it. The practice of
hypnotism was developed by one of his pupils, the Marquis de Puységur, but
it came to be known as mesmerism.
These days, large numbers of websites are offering magnetic bracelets,
belts, mattresses, necklaces, and undergarments for the treatment of a variety
of ailments. Their efficacy seems to be directly proportionate to the
credulity of the patient.
Hydrotherapy as a formal system of treatment was devised in 1829 by an
Austrian, Vincent Priessnitz (1799-1851).
Hydrotherapy is probably one of the oldest forms of medical therapy.
Its use has been recorded as early as ancient Egyptian, Greek, and
Roman civilizations. Bathing in water with essential oils and flowers is
said to be one of the secrets of the beauty of Cleopatra, the famed
Egyptian queen. Bathing in hot water springs has also been an accepted
method of treatment for a variety of diseases. Today, hydrotherapy is
utilized in the treatment of arthritis, musculoskeletal disorders as well
as in physiotherapy of patients with cerebral strokes. The scientific
evidence does not always support the claim of the effectiveness of
hydrotherapy. Moreover, like many descriptive terms, the term
“hydrotherapy” is misleading. The benefit, if any, comes from the hot
or cold temperature of the water, not from the water as such.
Colon hydrotherapy is a somewhat different type of hydrotherapy. In the
early 1900s, an American physician, John Kellogg, is said to have cured
thousands of his patients with gastrointestinal disorders by colonic
irrigation. The popularity of colon therapy in that country reached its
peak from the 1920s to the 1940s. At that time, colon irrigation machines were
commonly seen, and regularly used, in hospitals and doctors’ offices
in the USA. Even today, there are Guilds of Colon Hydrotherapists in
various countries as well as an International Association of Colon
Hydrotherapists.
Naturopathy is another of the various types of alternative medicine.
It was developed and popularized by an American physician, Dr
Benedict Lust. In 1902, he founded the American School of
Naturopathy. During the third and fourth decades of the 20th century,
there was tremendous interest in this form of treatment. Naturopathy
emphasizes the ability of the body to heal itself, if
given the opportunity. It stresses prevention of disease by
living close to nature, rather than cure. With the strides made by
allopathic methods of treatment, the public lost all interest in naturopathy
after the 1940s. In India, it maintains a minor degree of acceptability and is
being touted as a form of medical tourism. An organization in the USA
even offers a PhD degree in naturopathy.

THE IG NOBEL PRIZE


The Ig Nobel Prizes are a parody of the Nobel Prizes, awarded each year
since 1991 for discoveries that cannot, or should not, be reproduced.
They are sponsored by the scientific humor journal Annals of Improbable
Research, a US publication. These prizes are a commentary on the silly
extent to which some scientists will go to see their name in print, as well
as on the quality of editors who publish such “research” papers. However,
the “official aim” of the awards is to increase interest in science. A
list of the Ig Nobel Prize winners in medicine is given below:

1991 Alan Kligerman For his pioneering work in antigas
liquid that prevents bloat, gassiness,
discomfort, and embarrassment.
1992 F Kanda et al For their pioneering study
“Elucidation of chemical compounds
responsible for foot malodor,”
especially for the conclusion that
people who think they have foot odor
do, and those who don’t, don’t.
1993 James F Nolan et al For their painstaking research report
“Acute management of zipper-
entrapped penis.”
1994 Richard C Dart On the discovery that electric shock
therapy does not help in the treatment
of rattlesnake bite.
1995 Marcia E Buebel et al For their study entitled “The effect
of unilateral forced nostril breathing
on cognition.”
1996 James Johnston et al For their unshakable discovery,
testified before the US Congress, that
nicotine is not addictive.
1997 Carl J Charnetski et al For their discovery that listening to
Muzak stimulates the immune system
that may help to prevent common
colds.
1998 Caroline Mills et al For their report of “A man who
pricked his finger and smelled putrid
for five years.”
1999 Arvid Vatle For “carefully collecting and classifying
the kinds of containers his patients
chose when submitting urine samples.”
2000 Willibrord W Schultz et al For their illuminating report
“Magnetic resonance images of male
and female genitalia during sexual
arousal.”
2001 Peter Barss For his medical report on “Injuries
due to falling coconuts.”
2002 Chris McManus For his report on “Scrotal asymmetry
in man and in ancient sculpture.”
2003 Eleanor Maguire et al For presenting evidence that the
brains of London taxi drivers are more
highly developed than those of their
fellow citizens.
2004 Steven Stack et al For “The effect of country music on
suicide.”
2005 Gregg A Miller et al For “Invention of artificial testes for
dogs.”
2006 Francis M Fesmire For his paper “Termination of
intractable hiccups with digital rectal
massage.”
2007 Dan Meyer and Brian For investigating side effects of
Witcombe swallowing swords.
HISTORY OF IMMUNOLOGY
The principle of immunity was known centuries before
the work of Louis Pasteur and Robert Koch demonstrated the existence
of microbes. People knew that those who survived many of the common
infectious diseases usually did not contract the same disease again.
Beginning around 1000 AD, the Chinese practiced a form of immunization
by inhaling dried powder derived from the crusts of smallpox lesions.
Gradually, the practice spread to the Arabs and the Turks. The
procedure was mainly restricted to young girls so as “to preserve their
beauty.” A British lady, Lady Mary Wortley Montagu, who had stayed in
Turkey with her husband and had her daughter immunized, tried to
popularize the practice in England. It was vehemently opposed by the
Church. The clergy called the practice “un-Christian” and against the
will of the Lord. To complicate the matter, there was no way to
standardize the material to be inhaled or inoculated with a sharp needle.
Sometimes the procedure resulted in full-blown smallpox, leading to
disfigurement or even death.
Immunization against smallpox became acceptable (in spite of great
resistance from the Church) when Jenner started using material from a
lesion of cowpox. Within a few years it became a worldwide practice.
The real boost to the development of immunology was given by the
discovery of microbes by Louis Pasteur. In 1880, while Pasteur was
experimenting with chicken cholera, his assistant inadvertently left
a flask of the organisms on a table and went on a month-long vacation.
On his return, he continued with the experiment and injected the old
bacteria into a group of chickens. To his surprise, the chickens did not
develop the disease. Those chickens, as well as a new batch of the birds,
were injected with a fresh stock of the bacteria. All the chickens of the
new batch developed the disease and died within days, but the first batch
remained “a picture of health.” Pasteur tried to infect them three times,
but the chickens remained healthy. Pasteur concluded that the virulent
chicken cholera
bacteria had become attenuated by a prolonged exposure to air and
sunlight and provided immunity against infection even by the virulent
bacteria.
Some other methods for attenuation of bacteria were soon
developed, e.g. by heat or by passage through an unnatural host. Pasteur
infected a rabbit with the spinal cord of a fox that had died of rabies.
Being an unnatural host, the rabbit became infected but did not develop
full-blown rabies. In this way, the nervous tissue of the rabbit contained
an attenuated rabies virus. When injected into a natural host such as the
dog, the virus gave immunity against rabies without producing the disease. From
this observation Pasteur went on to develop a vaccine for anthrax, a
deadly disease of cattle, and then the antirabies vaccine. With the
establishment of the Pasteur Institute in Paris, by the 1890s vaccines against
anthrax and rabies were produced commercially and used in many
countries of Europe.

EMIL BEHRING
Diphtheria was once one of the most
dreaded diseases of childhood, with
frequent large scale epidemics. In some of
these epidemics, as many as 80 percent of
the children under the age of 10 died. The
name diphtheria is derived from the Greek
word “diphthera” for leather. It alludes to
the leathery, sheath-like membrane that
grows over the tonsils, throat, and nose.
The patient dies because of difficulty
in breathing and bacterial toxemia. In the
year 1891, Emil Behring (Fig. 6.1), a
German physician working in Koch’s
institute announced that the whole blood or serum of an animal, which
has been rendered immune to diphtheria or tetanus by injection of the
relevant vaccine, could be used to treat another animal injected with a
lethal dose of the bacteria. Since Behring initially used guinea pigs to
obtain the immune serum, only small quantities were available. However,
human trials started just after the announcement by Behring. Newspapers
were full of reports of miraculous effects of the serum therapy in children
suffering from diphtheria. By 1895, large quantities of antidiphtheria
serum became available when horses were used to raise the antitoxins.
Thus the serum therapy was born. Soon, serum therapy was available
for other infectious diseases like tetanus, typhoid, plague, and cholera.
(The use of antitetanus serum became more popular after its benefits
were noted in soldiers wounded in World War I.) Behring received the
first Nobel Prize in Medicine, in 1901.
The availability of the antidiphtheria serum could not prevent the
outbreak of diphtheria epidemics. In the 1920s, there were an estimated
100,000 to 200,000 cases of diphtheria per year in the USA alone, with
13,000 to 15,000 deaths per year. Further success in the treatment of
diphtheria came with the introduction of sulpha drugs in the mid-1930s.
The first successful vaccine for diphtheria was available in 1923,
but large-scale mass vaccination against diphtheria became a reality
when the WHO took up the program in the 1960s. Even in the 1990s, 200,000
cases of diphtheria were reported in the former states of the USSR, with
5000 deaths.

ELI METCHNIKOFF
The presence of white cells in the blood had long been known, but as
late as the 1880s no function could be assigned to them. In 1882, a Russian
microbiologist, Eli Metchnikoff (Ilya Mechnikov) (Fig. 6.2), for the
first time, observed the phenomenon of phagocytosis. While
experimenting with the larvae of starfish,
he observed amoeba-like cells in the
organism attracted to and seeming to ingest
foreign substances like fungi. He had already
done much work on evolutionary
relationships, and in a fit of inspiration, it
occurred to him that the amoeba-like cells
in the starfish might be primitive analogues
of the pus cells seen in the inflammatory
response in higher animals. Microscopic
observations on animals infected with
various organisms revealed white blood cells
attacking, ingesting, and killing the germs. Based on these observations,
Metchnikoff presented a radically new theory that certain white blood
cells could engulf and destroy harmful bodies such as bacteria. The
European experts in microbiology like Pasteur and Behring scorned the
humble Russian and his theory. Koch went to the extent of suggesting
that white blood cells actually provided the vehicle whereby germs spread
throughout the organism! However, Pasteur offered Metchnikoff a post
at his Paris Institute to carry forward his work, which was accepted. It
took two decades for the world to accept the phagocytic function of the
white blood cells, and in 1908 Metchnikoff was awarded the Nobel Prize
in Medicine for this work.
Medical research has often been accelerated when a dispute arises
between two opposing schools of thought. Each group of scientists tries
to produce evidence in favor of its own theory. One such example is the
cellular versus humoral mechanism of protection against bacterial
invasion of the body. Whereas Metchnikoff and a few colleagues were
strongly in favor of the role of white blood cells (at that time called the
cellular theory), the humoral basis of protection was supported by a
larger number of scientists, especially Paul Ehrlich who demonstrated
that the immunity to diphtheria and tetanus was provided by circulating
antibodies. After more than two decades of debate between the two
groups, the two theories received international recognition with the
award of the Nobel Prize in Medicine, 1908, to both Metchnikoff and
Ehrlich. (These days, cellular immunity has a different meaning, i.e.
the role of T-lymphocytes.)

CHARLES RICHET AND OTHER NOBEL LAUREATES IN IMMUNOLOGY
Charles Richet (1850-1935) was awarded Nobel Prize in Medicine,
1913 for his work on anaphylaxis. Richet was a French physician and
physiologist. While cruising on the yacht of the Prince of Monaco, he
studied the effects of marine invertebrate poisons on
mammals. He observed that the effect did not depend as much on the
toxic properties of the substance as on its function as an antigen in the
previously sensitized animal. Thus, for the first time the phenomenon
of anaphylaxis was studied. He showed that the protective mechanisms
of immunity might also cause disease. Later on, with the demonstration
of the relation between the experimental anaphylaxis and human allergies,
the clinical importance of Richet’s work became obvious.
Jules Bordet (1870-1961) was a Belgian physician and a colleague
of Metchnikoff at the Pasteur Institute in Paris. He demonstrated the
mechanism of complement-mediated bacteriolysis. This discovery led
August von Wassermann and his colleagues to develop the complement-
fixation test for syphilis. Bordet received the Nobel Prize in Medicine,
1919.
Karl Landsteiner (1868-1943) was a Viennese physician. While
studying antierythrocyte antibodies, in 1901, he discovered the isoagglutinins
that now comprise the ABO system of blood groups. He discovered the
MN system of blood groups in 1926 and Rh blood group system in
1940. For the discovery of blood groups, Landsteiner was awarded
Nobel Prize in Medicine, 1930.
Max Theiler (1899-1972) was a South African who studied
medicine in Britain and then joined the School of Tropical Medicine at
Harvard, USA. It was he who showed that yellow fever, rampant in
Africa at that time, was caused by a filterable virus. In the 1930s he
developed attenuated strains of yellow fever virus by serial passage in
vitro in mouse and chick embryo tissue cultures. From these strains, he
obtained viruses that retained their antigenicity,
but were devoid of pathogenicity. Thus the vaccine against yellow fever
was made available. He received Nobel Prize in Medicine, 1951 for
this work.
Daniel Bovet (1907-1992) was a Swiss physiologist and
pharmacologist. By the 1940s it was known that histamine was one of the
important mediators of allergic reactions. Bovet was awarded the Nobel
Prize in Medicine, 1957 for the development of antihistaminic drugs
for the treatment of asthma and hay fever.
Macfarlane Burnet (1899-1985), an Australian, and Peter B
Medawar (1915-1987), a Briton, were awarded Nobel Prize in
Medicine, 1960 for their work on “acquired immunological tolerance.”
Medawar, trained in zoology and pathology at Oxford, was assigned the
task of finding out the cause of rejection of skin grafts from another
human. He demonstrated that the graft rejection was based on the same
mechanism as responsible for protection against bacterial and viral
infections. Earlier, in 1945-1947, Ray Owen had reported a curious
observation that dizygotic cattle twins did not reject grafts between
each other. Based on these observations, Burnet came out with the
hypothesis that immunological responsiveness arises fairly late in
embryonic life. It involves a cataloguing of the antigens then present
(self-antigens), to which the host henceforth would be tolerant and
unable to respond immunologically in postnatal life. Any antigen
not so catalogued would be considered a nonself antigen and would provoke an
immunological response.
Rodney R Porter (1917-1985), a Briton, and Gerald M Edelman
(born 1929), an American, were awarded Nobel Prize in Medicine,
1972 for the discovery of the chemical structure of antibodies.
Rosalyn S Yalow (born 1921), an American, was awarded Nobel
Prize in Medicine, 1977 for developing the radioimmunoassay
technique for the estimation of peptide hormones in the concentrations
of nanograms or even picograms per ml of plasma. This was a great
stride in the basic and clinical endocrinology.
Baruj Benacerraf (born 1920) (USA), Jean Dausset (born 1916)
(France), and George Snell (USA) were awarded Nobel Prize in
Medicine, 1980 for the discovery of HLA antigens, now known as the
major histocompatibility complex (MHC).
Cesar Milstein (born 1927) (GB), George Kohler (1946-1995)
(Germany), and Niels K Jerne (1912-1994) (Denmark) were awarded
Nobel Prize in Medicine, 1984 for the development of monoclonal
antibodies.
Susumu Tonegawa (born 1939) (Japan) was awarded Nobel Prize
in Medicine, 1987 for his work on the molecular biology of
immunoglobulin genes, demonstrating how antibody diversity is
generated.
Peter Doherty (born 1940) (Australia) and Rolf Zinkernagel
(born 1944) (Switzerland) were awarded Nobel Prize in Medicine,
1996 for their work on the role of MHC in immune responses. They
demonstrated that the T-cell receptor is so constructed that it is able to
bind tightly to a polypeptide breakdown product of the virus, which
lies in a special cleft on the surface of the MHC molecule.
DISCOVERY OF T- AND B-LYMPHOCYTES


Till the middle of the 20th century, nothing substantial was known about
the function of lymphocytes and their central role in immunology. It was
argued that, with their scanty cytoplasm, the lymphocytes could not have
any significant function. At best, lymphocytes were considered to be
pluripotent hemopoietic stem cells. The roles of the thymus
and the Bursa of Fabricius in immunity were discovered before the
discovery of T- and B-lymphocytes.
Robert Good, an American physician, was intrigued by the observation
that patients suffering from different types of cancer seemed to contract
different types of infections. Patients with Hodgkin’s disease were
more liable to viral or fungal infections or tuberculosis, whereas patients
with multiple myeloma were more susceptible to bacterial infections.
Dr Good had the feeling that there might be two types of immunity, but
he had no evidence. He also investigated 45 cases of thymomas and all of
them were found to have immunological deficiency. It was the first time
that a possible association between the thymus gland and immunity was
suggested.
The role of Bursa of Fabricius in immunity came to be known by a
laboratory accident. Timothy Chang, one of the postgraduate students
of a veterinary scientist, Professor George Jaap, was assigned the
responsibility of finding out the function of the Bursa, a collection of
lymphoid tissue situated near the cloaca in birds. For this purpose,
he was surgically removing the bursas of chickens so as to find out the
effect on body function, if any. Another student, Bruce Glick, was
trying to demonstrate the development of gammaglobulins in chickens
on vaccination with Salmonella bacteria. Glick was intrigued to find that
of the two batches of chickens vaccinated, one batch showed development
of a high concentration of antibodies whereas the second group did not
show any rise. On further investigation, Glick came to know that
the technician in charge of the animal house had, by mistake, given Glick
some birds in whom the Bursa had been removed by Chang. Initially
Glick was angry with the laboratory technician for giving him the
“defective” birds. Slowly the importance of the discovery dawned on
him. Then, the two postgraduate students tried to publish their results,
but no medical journal was willing to publish “such an absurd work.”
Ultimately the paper was published in Poultry Science. The article was
brought to the notice of Dr Good. Now Dr Good felt bold enough to put
forward his two-component theory of immunity (cellular and humoral)
in the late 1960s. Subsequent work confirmed the existence of two types of
lymphocytes. Those which required the thymus for their maturation
were called the T-lymphocytes. The other group of lymphocytes required
the Bursa of Fabricius for their maturation and were therefore called
B-lymphocytes. In humans, there is no Bursa of Fabricius and the
second group of lymphocytes attains maturity in the bone marrow.
Therefore the name B-lymphocytes remained appropriate even in
humans.

HISTORY OF OPHTHALMOLOGY —
ANCIENT CONCEPTS OF ANATOMY OF THE EYE

The structure of the eye was not known till the 16th century AD.
Hippocrates, Aristotle and Galen believed that the eye contained a fluid,
the medium of vision, that flowed from the eye to the brain through a
tube. The presence of the lens in the eye was believed to be a postmortem
phenomenon. Later on, the lens was found to be a normal component of
the eye, and the anterior and posterior chambers were
recognized. The anterior chamber was said to contain a watery fluid,
whereas the posterior chamber was said to contain an egg-white-like
substance. The anatomy of the eyeball was properly described for the
first time by Vesalius (1552 AD). However,
his drawings showed the position of the
lens in the middle of the eyeball (Fig. 6.3).
It was Fabricius (1600 AD), a pupil of
Vesalius, who presented a drawing of the
eye in which the lens was shown in the true
position, i.e. directly behind the iris. In the
18th century, using a microscope,
Leeuwenhoek, a cloth merchant turned
scientist, discovered the histological
structure of the rods and cones in 1722, but no one took any
notice till 1834, when the presence of the rods and cones of the retina
was rediscovered by Treviranus. Still, the function of the eye remained a
mystery. For example, at that time the lens was perceived to be the seat
of vision, not a tool of vision. The change in pupillary size was attributed
to a change in the amount of blood present in the blood vessels of the iris.

Fig. 6.3: Ancient concept of the eyeball

THOMAS YOUNG
Thomas Young (1773-1829) (Fig. 6.4), a
British physician, is remembered for his
pioneering work in optics. He was
trained both as a physician and a physicist.
He established himself as a physician in
London. His interest in research led to many
discoveries in physiology and optics of the
eye. Initially he published his research work
anonymously to protect his reputation as a
physician.
Even as a medical student he had discovered the way by which the
lens of the eye changes its shape to focus on objects at different distances.
This simple but elegant experiment consisted of observing the images of
a candle reflected from the cornea as well as from the anterior and
posterior surfaces of the lens. When looking at a near object, the image
of the candle on the lens decreased in size but there was no change in the
image from the cornea. He concluded that the convexity of the lens increases
when looking at a near object. This was a remarkable discovery in the
physiology of the eye for those times. However, regarding the mechanism
of accommodation, Young was far off track. He considered the lens
to contain contractile fibers and spent many years trying to locate the
nerve fibers in the lens, the action of which, he supposed, caused a
change in the shape of the lens.
Another of Young’s achievements in optics was the proposal of the
wave theory of light. By passing a beam of light through two parallel
slits in an opaque screen, he demonstrated alternating light and dark
bands (interference fringes) on a white surface beyond the slits.
As early as 1801, prior to the discovery of cone cells in the retina,
Young introduced the original theory of color vision. It was based on the
assumption that there are three groups of color-sensitive cells in the
retina, responding to red, green and blue colors respectively. Different
colors are perceived by stimulation of these color-sensitive cells in
different proportions. He also calculated the approximate wavelength
of the seven colors of the spectrum. This concept was further developed
by Hermann von Helmholtz, and it is remembered as the Young-
Helmholtz theory of color vision. The theory was experimentally proved
as late as 1959.
HERMANN HELMHOLTZ
Hermann von Helmholtz (1821-1894) (Fig.
6.5) was a German physiologist whose
invention of the ophthalmoscope marked the
beginning of the modern era in
ophthalmology.
In school, Helmholtz did well in
mathematics and was outstanding in physics.
He desired to become a physicist but his
father’s meager salary was insufficient to
support his university education. His father
advised him to join an army medical school
where students were given free education. In
return, on completion of the studies, the
students were to serve for eight years in the German Army. Thus he
became a physician out of financial compulsion. While in the medical
school, Helmholtz came in contact with the most famous German
scientists of that era such as Emil Du Bois-Reymond, Johannes Muller,
and Rudolf Virchow. Probably contact with these eminent men aroused
his life-long interest in physiological research.
On completion of his compulsory army service, Helmholtz became
a professor of physiology and served in many German universities. The
ophthalmoscope was invented by Helmholtz in 1850, early in his career
as a physiologist. He developed the instrument merely to explain
to his students why the pupil of the eye looks black most of the time but
bright red on other occasions. His aim was to show that light is
emitted from the pupil along the same course by which it enters. In the bargain he
was able to see an image of the retina. It was the first time that the
condition of the retina was visualized in a living individual. Helmholtz
was quick to realize the significance of the discovery, and he presented
his crude ophthalmoscope before the Berlin Physical Society in 1850. It
consisted of a lens glued to a glass cover slip (of the kind used for
microscopic specimens). The light source was a candle. The cover slip
acted as a mirror, reflecting the rays of light into the pupil, while
remaining transparent enough for the observer to see through. The
instrument (called an eye mirror by
him) was not immediately liked by his contemporary ophthalmologists
(Fig. 6.6). They feared possible retinal damage caused by direct
exposure to the “strong rays of light.” Moreover, the part of the retina
visualized was rather small. However, some well-known ophthalmologists
of that era, notably Donders, Graefe, Jaeger, and Bowman, started
the use of ophthalmoscope in their patients. When von Graefe saw the
retina for the first time using Helmholtz’s ophthalmoscope, his cheeks
reddened, and he called out excitedly, “Helmholtz has unfolded to us a
new world.” By the 1860s, the use of the ophthalmoscope for the diagnosis of
disorders of the eye became so important that the ophthalmologists
took pride in using the instrument and called themselves
“ophthalmoscopists.”
Besides the invention of the ophthalmoscope, Helmholtz is
remembered for many other accomplishments such as development of
the theory of color vision initially enunciated by Thomas Young, the
doctrine of conservation of energy, and his work on acoustics.

FRANS DONDERS
Frans Cornelis Donders (1818-1889)
(Fig. 6.7), a Dutch physician, is another
well-known name in the history of
ophthalmology.
In 1842, he became a lecturer at the
Military Medical School, where he
translated German scientific books into
Dutch to supplement his income. In this
way he was introduced to the physiology
of vision, and realized that little was known
on the subject at that time. In 1847, he was
made Professor Extraordinaire of a
university medical school, where he continued with his investigations in
the field of ophthalmology. In 1851, he attended the Great Exhibition in
London and came in contact with two famous ophthalmologists of that
era, William Bowmann (1816-1892) and Albrecht von Graefe (1828-
1870), with whom he developed a life-long friendship. He heard about
Helmholtz’s ophthalmoscope for the first time in London. Almost
immediately, he acquired the instrument and started using it in all his
patients with unexplained visual complaints. He started receiving patients
with such complaints from all over the Netherlands. In the early 1850s, Donders
started a small eye clinic. By 1858, the clinic had grown into the
Netherlands Hospital for Necessitous Eye-patients.
Before his time, refractive errors were classified according to the
correcting lens required; the patient was diagnosed as suffering from
myopia if vision improved with a concave lens, and presbyopia if the
patient needed a convex lens. The “puzzling” thing about presbyopia
was that occasionally it occurred in young people. The disorder was
then named “old sight of young people.” It was Donders who
recognized the condition of hypermetropia. He also clarified that though
both presbyopia and hypermetropia were corrected by a convex lens,
the former was due to a defect in the accommodation of the eye, whereas
the latter was due to a shortening of the length of the eyeball (an
antithesis of myopia). The concept and the term emmetropia also came from
Donders. Donders is also credited with the discovery of the error of
refraction called astigmatism, caused by irregularity in the curvature of
the lens or the cornea. He also devised cylindrical lenses for its
correction. Thus, Donders was able to detect all types of errors of
refraction and was able to provide the appropriate lenses for each type.
Donders was also able to measure intraocular pressure to diagnose
glaucoma, using a tonometer devised by him.
Besides ophthalmology, Donders made many significant discoveries
in cardiovascular physiology. He was the first to take a graphic record of
the heart sounds and murmurs on a revolving drum (phonocardiograph).
He was the first to demonstrate the effect of variations in the heart rate
on the duration of ventricular systole and diastole. In fact, Willem
Einthoven, the key developer of electrocardiography, had spent time in
Donders’ research laboratory.

VON GRAEFE
Friedrich Wilhelm Ernst Albrecht von Graefe (Fig. 6.8) (1828-1870)
was an outstanding German ophthalmologist. When he received his
medical degree in 1847, von Graefe was not sure in which branch of
medicine to specialize. During a visit to Prague, he was strongly
influenced by the professor of ophthalmology, Carl Ferdinand von Arlt.
History of Development of Specialities | 355

Subsequently, during the next two years, Graefe worked in the famous
ophthalmologic clinics of Paris, Vienna and London. In London, he came
in contact with Frans Donders, with whom he
formed a life-long friendship. There he
was introduced to the ophthalmoscope
invented by Helmholtz. He was the first
ophthalmologist to use the ophthalmo-
scope regularly in his patients.
On returning to Berlin in 1850, at the
age of 22, he founded his own eye clinic
and soon had a surprisingly busy practice.
Fig. 6.8: Graefe

Graefe’s great care for his patients, as well as his scientific skills,
earned him a name not only among his patients but also among the
contemporary ophthalmologists of Europe. He maintained
friendly and scientific associations with Hermann Helmholtz, Frans
Donders, Eduard von Jaeger, William Bowman and Friedrich Horner.
Among many of his students, Argyll Robertson and Theodor Billroth
became famous in their own right.
He was a pioneer in the mapping of visual field defects and in the
diagnosis and treatment of glaucoma by iridectomy. He improved upon the method
of cataract extraction devised by French ophthalmologist, Jacques Daviel
in 1748. Graefe’s method remained the standard procedure for cataract
surgery for the next 100 years. He also rationalized the use of many
important drugs used in ophthalmic practice, like mydriatics and miotics.
He was the first to describe the visual impairment because of retinal
artery embolism and optic neuritis. In 1854 von Graefe, aged 26, founded
the first medical journal of ophthalmology (“Archiv für
Ophthalmologie”). In the first issue of the journal, 400 of the 480 pages
were authored by him. Graefe was the founder of one of the earliest
ophthalmic societies, “German Ophthalmologic Society” in 1857. He
died of pulmonary tuberculosis in 1870, at the age of 42.

HERMAN SNELLEN
Herman Snellen (1834-1908) was a Dutch general medical practitioner
and a close friend of Frans Donders. He shifted to the practice of
ophthalmology and soon became an excellent ophthalmologic clinician,
surgeon and a research worker. Donders and Snellen worked together on
many research projects on the diseases of the eye. While Donders was
a scientist, Snellen was more practically
oriented and an excellent surgeon. Snellen rose to
become professor of ophthalmology at one of
the famous universities of the Netherlands.
Snellen is mostly remembered for the
development of the Snellen eye chart to quantify
the degree of visual impairment of an individual
(Fig. 6.9). The now well-known Jaeger charts for near vision were
developed in 1854 by Eduard von Jaeger, an Austrian ophthalmologist.
He developed the charts in several languages, and they found
widespread popularity.

Fig. 6.9: Original Snellen chart

Jaeger got all these charts printed in the State Printing House in
Vienna. The Jaeger numbers (1-6) referred to the item
numbers in the Printing House Catalogue and have no numerical meaning.
Since no standards were ever set, print size #4 on one card
may be equivalent to print size #6 from another printing press.
Snellen’s charts for testing visual acuity were developed by Herman
Snellen at the special request of his friend, the ophthalmologist Frans
Donders. Because of his scientific bent of mind, Snellen’s charts
were prepared on a more scientific basis than the Jaeger charts. Snellen
did not use the existing type faces. Instead, he designed a special type
face in which, when seen from the given distance, each letter subtends an
angle of five minutes at the eye. Further, each limb of the letter subtends
an angle of one minute. These values were obtained by examining a
large number of normal individuals. Donders and Snellen did a population
study and realized that 20/20 vision did not reflect the maximum acuity
possible in humans. It merely represented the visual acuity of most of
the normal individuals. These results were published along with the first
Snellen’s charts in 1862. It is interesting to note that in a study conducted
more than 130 years later (1995), no change was detected in the visual
acuity of the general population.
After Donders, Snellen became the professor of ophthalmology at
the same university. Besides the charts named after him, Snellen was
responsible for a very comprehensive work on many disorders of the
eye like astigmatism, glaucoma, and retinal diseases.

CATARACT SURGERY
Cataract, the word used to describe opacification of the crystalline lens,
comes from the Greek for waterfall. Until the middle of the 18th century, it
was thought that a cataract was formed by opaque material flowing, like
a waterfall, into the eye. The earliest written reference to cataract surgery
is found in Sanskrit manuscripts of 400-500 BC.
Sushruta (4th century BC Indian Ayurvedic surgeon) is credited
with a written account of about 72 diseases of the eye. He is believed to
be the first ophthalmic surgeon to perform cataract surgery. His book,
the Sushruta Samhita, describes a number of instruments for a type of
cataract surgery called couching. In couching, the opaque lens was
pushed back into the vitreous cavity in the back of the eye (Fig. 6.10).
In the Middle Ages, couching of the cataract was described by
many Arab physicians.
Modern cataract surgery, in which the cataract is actually extracted
out of the eye, was introduced by Jacques Daviel in Paris in 1748.
About 100 years later, the
technique was further refined by
the German ophthalmic surgeon,
Albrecht von Graefe and called
extracapsular extraction of
cataract. The technique required
a large (10-12 mm) incision
around the cornea which had to
be stitched. The large incision
took 6-8 weeks to heal.
Moreover, the stitches disturbed
the normal curvatures of the cornea, leading to postoperative
astigmatism.

Fig. 6.10: Indian couching

Till 1884, the operation was performed with the help of
one or two strong assistants who kept the head of the patient still while
surgery was performed. With the discovery of cocaine as a local
anesthetic, cataract surgery became a painless procedure. The latest
and now most preferred technique in cataract surgery is
phacoemulsification, developed by the American ophthalmologist
Charles Kelman in 1968. Through a small (3 mm) incision, a fine
instrument is introduced into the eyeball. The tip of the instrument
gives off localized high-frequency waves that break up the cataract into
very fine fragments, which are sucked out through the same tip. This
operation does not require any stitches. The patient can resume normal
activity much sooner than after the traditional cataract surgery.
In the traditional cataract surgery, the operated eye became aphakic
(without the lens). Hence, the patient required thick convex lenses to
restore normal vision. The problem was solved by the development of
intraocular lens implantation.

HAROLD RIDLEY
Intraocular lens (IOL) implant
surgery was developed by Sir Harold
Ridley (Fig. 6.11), a British ophthalmologist,
in 1949. He had to fight for many decades
with the other British ophthalmologists,
before the procedure was finally accepted
in the UK and later in other countries.
Ridley was working with Royal Air
Force casualties during World War II. He
noticed that when splinters of Perspex from
aircraft cockpit canopies became lodged in
the eyes of wounded pilots, they did not
Fig. 6.11: Ridley
trigger rejection. Therefore, he proposed that
an artificial lens made of Perspex could be implanted in patients whose
cataract had been removed. His first Perspex lens was implanted in 1950.
By the 1960s, the quality of the IOL was further improved and he started
using the artificial lens in all cases of cataract surgery. The technique was
not to the liking of the other top guns in British ophthalmology,
particularly Sir Stewart Duke-Elder, the Queen’s eye surgeon.
The lack of recognition of Ridley’s achievement was due to the fact
that some clinical complications occurred at the hands of some
ophthalmologists, mainly because of variations in the surgical technique
adopted by them. As a result, other eye surgeons were not willing to
take chances with the vision of the patient. Moreover, Ridley was not
an aggressive advocate of his procedure. Besides these factors, the politics
among the ophthalmologists, particularly the jealousy of Duke-Elder
proved an impediment in the acceptance of IOL in the UK. In the USA,
IOL was allowed to be used only in 1981. Since its inception, the IOL has
benefited over 60 million patients worldwide. Finally, Ridley was honored
by the British Government in the year 2000, when he was knighted,
less than one year before his death.
Sir Stewart Duke-Elder (1899-1978), a Scot, was a dominant
force in British and international ophthalmology for more than a
quarter of a century. He is best remembered as a talented and prolific
writer who wrote a seven-volume Textbook of Ophthalmology and a
fifteen-volume System of Ophthalmology, along with other books that
provided the educational foundation for most of the world’s
ophthalmologists. He was knighted and showered with many other
honors.

HISTORY OF OTORHINOLARYNGOLOGY—
HISTORY OF LARYNGOSCOPE

The first mention of the larynx can be found in the writings of Aristotle
(350 BC). In the early 16th century, Leonardo da Vinci gave a complete
description of the structure and function of the larynx. Giovanni Morgagni (1682-1771)
described the larynx in still greater detail. Ferrein (1741) coined the term
vocal cords. He considered the vocal cords to be comparable to strings
of a violin, activated to vibrate by the stream of air coming out of the
trachea. As early as 1745, Bertin pointed out that the structures were
folds of mucous membrane and hence should be called vocal folds.
However, though their official anatomic name is vocal folds, even today
the structures are known as the vocal cords in the medical literature. All
these studies on the larynx were based on dissections of cadavers.
There was no method to visualize the larynx in a living individual.
In 1829, as a medical student, Benjamin Guy Babington developed
the first “glottiscope.” It resembled chopsticks with spatulas on their
ends. One spatula depressed the tongue, while the other, positioned
along the palate, reflected the sunlight for illumination of the vocal
cords. Whether he ever saw them is not recorded.
Manuel Garcia (1805-1906) is generally given the credit for the
development of the first laryngoscope. Garcia was a professor of singing
at the Royal Academy of Music in London. In 1854, strolling in Paris,
he observed the sun’s image reflected in the window pane of a store.
The next moment, he purchased an easily available dentist’s mirror. In
combination with a hand-held mirror to reflect the sunlight, he was able
to visualize his own larynx and trachea. The feat was partly possible
because, as a singer, he had good control of his vocal cords and, more
importantly, an absent gag reflex. The discovery, called “auto-
laryngoscopy” was presented to the Royal Society in May, 1855. Garcia’s
actual aim was to understand how the vocal cords could produce such a
variety of musical sound. As a result of this discovery, he was awarded
an honorary medical degree and many international distinctions.
Almost at the same time, Ludwig Turck, a neurologist in Vienna,
developed a similar technique. He performed laryngoscopy on a large
number of patients, but the technique was found useless on cloudy days.
Johann Czermak, a physician and a physiologist from Budapest
borrowed the instrument from Turck and started performing
laryngoscopy using a table lamp as the source of light. Czermak published
the technique and demonstrated it widely. Since both of them claimed
credit for the laryngoscopic technique, a protracted public debate ensued
(nicknamed the “Turkish war” by the press). Ultimately, the matter was
settled when both of them were jointly awarded a prize by the Science
Academy of France. By 1870 a laryngology clinic was established in
Vienna and minor surgical procedures were performed in the glottis.
Since local anesthetics had not yet been discovered at that time, the
patient first had to be trained to suppress his gag reflex.

MORELL MACKENZIE
Morell Mackenzie (1837-1892) (Fig. 6.12), a British physician, was
one of the pioneers of the specialty of laryngology in the UK. He learnt
laryngoscopy from Czermak and established London’s first Throat
Hospital in 1863. Mackenzie was rapidly
recognized throughout Europe as an
authority on diseases of the throat. He
was one of the founders of the Journal
of Laryngology and Rhinology and of the
British Rhino-Laryngological Asso-
ciation. Mackenzie also published
several books on the laryngoscopy and
diseases of the throat.
Fig. 6.12: Mackenzie

So great was his reputation that when the crown prince of Germany (later
the Emperor Frederick III) developed a throat problem, Mackenzie was
called in for a second opinion. The disorder had been diagnosed as cancer
of the throat by the German doctors, and surgery had been advised.
Three days before the surgery, based on his clinical examination, and on
the biopsy report of Rudolf Virchow (the famous German pathologist
of that era), Mackenzie declared the lesion to be benign. Mackenzie’s
opinion prevailed and the surgery was abandoned. Morell Mackenzie was
knighted in 1887. The very next year, soon after the crown prince
became emperor, the cancerous lesion of the throat flared up and killed
him in June, 1888. This was followed by a bitter quarrel between Morell
Mackenzie and the German doctors. The German doctors published an
account of the illness of the prince, to which Mackenzie replied with a
publication entitled “The Fatal Illness of Frederick the Noble.” In this article,
Mackenzie argued that the lesion was initially benign but malignancy
developed later due to the effect of irritant medicines applied by the
German doctors!
Prosper Ménière (1799-1862) was a French physician who is
remembered for the first description of the disease (attacks of vertigo,
nausea, vomiting and deafness) in 1861 and subsequently named after
him. He attributed the disease to a disorder of the inner ear. It was the
first time that a disorder of the inner ear was described but the medical
fraternity took no notice. The reason was that, earlier (1822), the French
physiologist Marie-Jean-Pierre Flourens had studied the effects
of localized lesions on the brain of experimental animals like rabbits and
pigeons. He observed loss of equilibrium and motor incoordination on
removal of the cerebellum. The local physicians had more faith in experi-
mental evidence than in conjectures. Moreover, deafness, in general,
was known to be incurable; hence, no one was interested in the subject.
It was much later that Ménière got the credit for his achievement.

ROBERT BÁRÁNY
Robert Bárány (1876-1936) (Fig. 6.13) was
an Austrian physician. After his graduation
in 1900, he became a pupil of Sigmund Freud,
but subsequently joined the ear clinic
established by Adam Politzer (1835-1920),
the founder of otology in Austria. In this
clinic, while syringing the ear of a patient
with water, he was intrigued to observe
rhythmic oscillations of the eyeball, now
known as nystagmus. Since the water was
quite cold, he repeated the syringing with
warm water. The nystagmus occurred again,
but in the opposite direction. Each time, the patient complained of
vertigo as well.

Fig. 6.13: Bárány

After many painstaking observations, Bárány was
convinced that the internal ear was not only involved in hearing but also
served an important role in the maintenance of equilibrium (as earlier
claimed by Prosper Ménière).
Bárány described in detail the function of the various semicircular
canals. Bárány’s most important contribution was the clinical application
of his own and other workers’ experimental data on the human equilibrium
system. He devised a chair mounted on a rotating axis (Bárány’s chair)
for the clinical assessment of vestibular function. Bárány’s caloric
stimulation test of vestibular function was found still more useful, since
it could be performed at the bedside of the patient.
For the above-mentioned work, he was awarded the Nobel
Prize in Medicine in 1914. However, he was unable to receive the award,
since that very year he had joined the army and was held as a prisoner of
war by the Russian army. On the intervention of Prince Carl of
Sweden, he was released by the Czar, which enabled him to receive the
award in 1916. Back home, Bárány faced the jealousy of his Austrian
colleagues to the extent that he shifted to Sweden, where he spent the
rest of his life.
Hermann von Helmholtz, more famous for his invention of the
ophthalmoscope, made important discoveries in acoustics. In 1868, he
described the principles of the theory of Impedance Matching
(transmission of sound waves from low density air to high density fluid
medium of the internal ear), as well as the Resonance theory of Hearing,
also known as the place principle. Helmholtz proposed that different
parts of the basilar membrane of the cochlea were held taut, like the
strings of a musical instrument. In response to different wavelengths of
sound, different regions of the basilar membrane are set into resonant
vibration. High-frequency sounds set the basilar membrane at the base
of the cochlea into vibration, whereas low-frequency sounds did so at
the apex of the cochlea.

GEORG VON BÉKÉSY


Georg von Békésy (1899-1972) (Fig. 6.14) was a Hungarian engineer
who devised anatomic techniques that allowed rapid, nondestructive
dissection of the cochlea. Hard work of many
years resulted in the Traveling Wave theory
of hearing. For this work, Békésy was awarded
the Nobel Prize in Medicine in 1961.
After obtaining a PhD in physics, Békésy
became a communication engineer in the
Budapest Telephone office. Early in his
career, a visitor casually asked him whether
any significant improvement could be expected
in the telephone system in the near future. The
remark led Békésy to a more fundamental
question: “How much better is the quality
of the human ear than the telephone system?”

Fig. 6.14: Békésy
Soon, Békésy became a nuisance in the autopsy rooms of the local hospitals
as well as in the mechanical workshop of the Post Office. He used to obtain
the skull bone with the inner ear from the cadavers and then devised
methods to remove the tiny cochlea intact. Working under a microscope
with “microtools” of his own invention, he drilled tiny holes in the
cochlea. He replaced the cochlear fluid with saline solution containing
powdered aluminum and coal. By exposing the cochlea to flashes of
intense light, he could observe the movements of the basilar membrane
in response to sounds of various frequencies. Békésy observed that the
basilar membrane did not vibrate only at a specific point in response to
a particular sound frequency, as proposed by Helmholtz. Instead, he
found that, on exposure to sound of any frequency, a wave of movement
sweeps along the entire basilar membrane. However, the amplitude of
movement reaches its maximum in a particular part of the basilar
membrane. High-frequency sounds produced maximum amplitude near the
base of the cochlea, whereas low-frequency sounds produced the
greatest amplitude towards the apex. This theory, called the Traveling
Wave theory was essentially a refinement of the resonance theory or
place principle proposed by Helmholtz about 100 years earlier. The real
achievement of Békésy was the innovative techniques employed to
produce the experimental evidence.
No mammalian ear, from mouse to elephant, escaped the scrutiny
of Békésy. Once he read a newspaper report that an elephant
had died in the Budapest zoo. In no time he headed for the zoo, but the
carcass had already been sent to Budapest University. The biology
department, after some dissection, had sent the entire carcass to a glue
factory. Finally he succeeded in obtaining the skull bones of the elephant
from the glue factory. The “traveling wave” phenomenon could be
demonstrated in the elephantine cochleas as well.

HISTORY OF ORTHOPEDICS—
NICOLAS ANDRY “THE BIRTH OF ORTHOPEDICS”

Nicolas Andry (1658-1742) was a French physician. He was an eccentric
and remained controversial throughout his career. For example, the title
of his thesis for the MD degree was, “The relationship in the management
of disease between happiness of the doctor and the obedience of the
patient!” As a dean of the medical faculty, College de France, he was
constantly creating trouble for his colleagues, particularly the barber-
surgeons. He forbade the barber surgeons from operating if a doctor
(physician) was not present. In one of the numerous books written by
him, he attributed all illnesses to the presence of worms in the intestine
and therefore recommended strong purgatives for every ailment. The
title of another book was, “On the preeminence of medicine over surgery.”
It is ironic that today he is remembered as a man who coined the term
orthopedics, a surgical specialty.
At the age of 80, only two years before his death, Andry wrote a
book called Orthopedics: or the art of Correcting and Preventing
Deformities in Children. The word orthopedics has Greek roots and
means “straight child.” His book gave a picture of a crooked tree, which
came to be used as a
symbol of orthopedics (Fig. 6.15). The
picture was not on the cover page of the book.
It was used to illustrate his method for
correcting excessive curvature of a child’s leg.
He suggested “applying as soon as possible
a small plate of iron on the hollow side of the
leg, and fasten it about the leg with a linen
roller.” According to him, “same method must
be used for recovering the normal shape of
the leg, as used for making straight the
crooked trunk of a young tree.” Andry
believed that crooked legs in young children
were the result of allowing them to walk too
soon. The book was an instant success and was translated into many
languages.

Fig. 6.15: Tree of Andry

However,
because of the unscientific remedies suggested by him, Andry’s only
contribution to the subject remains the name of the specialty, orthopedics,
given by him.

PERCIVAL POTT
Percival Pott (1714-1788) is known in the field of orthopedics for the
detailed description of the ankle fracture, named after him. Pott’s most
famous work was on paraplegia caused by spinal tuberculosis (Pott’s
paraplegia).
Pott was a British barber-surgeon. When his father died, Pott was
only four years old. His father left him a patrimony of five pounds
sterling. This amount was found after Pott’s death among his effects in
a tin box. At the age of sixteen, he became an apprentice of a
barber-surgeon, Mr Nourse. After seven years of apprenticeship, the
Company of Barber-Surgeons, London, awarded him a diploma to
practice surgery independently.
During his long career as a barber-surgeon, Pott introduced many
improvements into the art of surgery. In those times cautery was used
so frequently that it was always kept heated and ready by the barber-
surgeon. Pott used it only sparingly. When Pott was 43 years old, he
was accidentally thrown from his horse, leading to a compound fracture
of the bones near the ankle. (Such a fracture is even now known as Pott’s
fracture.) He refused to be moved from the site of the accident and remained
on the pavement in severe cold till a door nailed to two poles was
purchased for him. Thus he was carried to a barber-surgeon who got
ready to do the amputation of the leg. Pott was convinced that a
conservative treatment would heal the fracture. He found support in
such treatment from his mentor Mr. Nourse. The leg was saved from
amputation, but Pott had to remain in bed for many months. It was
during this period of forced rest that Pott turned to writing. Subsequently,
Pott wrote many books that revolutionized the practice of surgery in
London. Pott is also remembered for the first description of the paraplegia
resulting from tuberculosis of the spine, in a 1782 publication “The
useless State of Lower Limbs in Consequence of a Curvature of the
Spine.”
In 1775, he published his observations on the high rate of scrotal
cancer in chimney sweeps, postulating a link between soot and cancer.
In those days, the chimney sweeps used to climb naked into the chimney
to clean it. Thus their bodies were in repeated contact with soot. This
was the first report where the possible cause of a cancer was described.
This report led to a change in the working conditions of chimney sweeps.
Pott’s kindness of heart towards his patients as well as his colleagues
was proverbial. At one time, three barber-surgeons were living in his
house because they could not earn a living from surgical work. The poor
financial condition of the barber-surgeons in those days is reflected in
the statement: “Surgery is a profession which procures its members
bread only when they have no teeth to eat it.”
Jean-André Venel (1740-1791), a Swiss physician, is considered
the first true orthopedist. He established the world’s first orthopedic
institute, solely concerned with the treatment of childhood skeletal
deformities. His work is specially appreciated because he recorded and
published all the methods of treatment used by him.

ANTONIUS MATHIJSEN
Antonius Mathijsen (1805-1878), a Dutch military surgeon, is credited
with the invention of the Plaster of Paris bandage, which remains the
mainstay of fracture immobilization to this day.
Plaster (gypsum) has been used since the time of the Roman Empire to
decorate houses and temples. The name “Plaster of Paris” originated in the
18th century, when the King of France decreed that the walls of all the
(wooden) houses in Paris must be covered with a layer of plaster
(gypsum) so as to render them fire-proof. This decree was the result of
the Great Fire of London (1666), which burnt down most of the city.
Large gypsum deposits found near Paris made
the plaster of Paris famous all over the world.
Fractures and their treatment find
mention in the works of Hippocrates and
Sushruta (Fig. 6.16). The earliest method of
treatment of fractures involved using splints
made of wood or bamboo or cloth bandages
stiffened with starch or lime. The problem
was the fragile nature of the splint or the
cast. In ancient times, compound fractures were invariably fatal.

Fig. 6.16: Ancient treatment of spine injury

On the treatment of such fractures, Hippocrates gave the following advice:
“One should try to escape from such cases provided one can do so
honorably; the hope of recovery is small and dangers many. If the
physician does not reduce the fractured bone, he would be looked upon
as unskillful, while reducing the fractured bone shall bring the patient
nearer to death than recovery.” In the medieval period, medical textbooks
carried illustrations of various types of splints, some of which are
comparable to those used these days. However, all these methods required complete
rest for the patient. Ambulatory treatment of fractures became possible
only in 1852, when Antonius Mathijsen, a Dutch army surgeon,
introduced roller bandages impregnated with quick-drying plaster of
Paris. With this method, broken bones could be held in place while wet
bandages were applied. When dry, the bandages became a rigid cast that
totally immobilized the broken bones till natural healing took place.

HUGH OWEN THOMAS


Hugh Owen Thomas (1834-1891) (Fig. 6.17)
is often remembered as the father of British
orthopedics. It is said that if you are allowed
to read about one person in the history of
orthopedics, then read about Thomas. His
father was a well-known bone setter, most
disliked by the qualified medical fraternity of
his time. He was arrested a number of times
for practicing medicine without a degree. As a
result, he forced all his five sons, including
Thomas, to qualify in medicine. In spite of his medical qualification,
because of his bad temperament, Thomas could neither work with his
father nor in any government hospital.

Fig. 6.17: Thomas

He treated all his patients at his home. He had
such a busy practice that he started examining patients at 5 or 6 AM,
and never left home except for professional purposes. On Sundays, he
did not charge the patients, and hence hundreds of patients would
surround his house for the free treatment. Hugh Thomas believed that a
fracture is best treated by prolonged and strict rest of the concerned
bone. For this purpose, Thomas devised a number of splints named
after him. However, his work was not recognized in his life-time. The
value of these splints was shown by his nephew, Sir Robert Jones, during
World War I, when the use of such splints reduced the mortality of
patients with compound fractures from 80 percent to less than 8 percent.
The beginning of the 20th century was a great turning point in
the history of orthopedics. With the discovery of X-rays, bone fractures
and other diseases came to be recognized as never before. During the
World War I, the important role of orthopedic surgery began to be
recognized. In 1918, 12 surgeons founded the British Orthopedic
Association, to the great resentment of the general surgeons. In 1918,
the Royal College of Surgeons in England expressed “mistrust and
disapproval in the movement in progress to remove the treatment of
conditions, always properly regarded as the main portion of the General
Surgeon’s work, and place it in the hands of orthopedic specialists.”

ROBERT JONES AND WATSON-JONES


Sir Robert Jones (1855-1933) was a nephew and student of Hugh
Owen Thomas. He understood the importance of the X-ray as early as
1896, and published the first report of the clinical use of X-ray to locate
a bullet in a wrist. Robert Jones founded several orthopedic hospitals
and the Association of the Orthopedic Surgeons. His textbook
“Orthopedic Surgery” was the first book to deal systematically with
the diagnosis and treatment of fractures. During WW I, Robert Jones
headed the orthopedic section of the British Forces. He was knighted
for saving thousands of soldiers by the use
of the splints invented by his uncle,
Hugh Owen Thomas, three decades earlier.
Sir Reginald Watson-Jones (1902-
1972) (Fig. 6.18) was another famous
British orthopedic surgeon. His book
“Fractures and Joint Injuries” published
in 1940 remained a standard reference
for several decades.
Fig. 6.18: Watson-Jones

In the treatment of fractures, he had a life-long faith in conservative
treatment with plaster of Paris. He did not favor internal fixation unless
it was absolutely essential. According to him, the cause of nonunion of a
fracture is usually
inadequate immobilization. “Nonunion of a fracture is due to a failure of
the surgeon more than the failure of osteoblasts.” His attitude to the
treatment of a fracture was opposite to that of other orthopedic surgeons,
who used to say “plaster means disaster” and supported nailing procedures
and early mobilization. Once commenting on his conservative approach,
Watson-Jones described himself as a physician destined to act as a
surgeon.
Watson-Jones was a strict believer in bloodless surgery. He claimed
that he could arthrodese a hip without a stain of blood on the drapes. He
certainly did that but only because every blood-stained swab was
immediately picked up and discarded by the assistant. His clinical acumen
has been summarized as follows: “He was always positive, never in
doubt and, by the grace of God, always right.” Watson-Jones had a remarkable
ability of making each patient feel that he or she was the only one of
importance that day; a gift he probably acquired when working with Sir
Robert Jones.
Besides a busy practice, Watson-Jones was the editor of the British
edition of the prestigious Journal of Bone and Joint Surgery. In addition
he was revising his book ‘Fractures and Joint Injuries’ at regular intervals.
All this work kept him busy day and night. In the preface of
the fourth edition of the book he refers to this habit: “Some of the
happiest moments of my life are writing this book from midnight to
dawn.”

KUNTSCHER AND ILIZAROV


Gerhard Kuntscher (1900-1972), a German orthopedic surgeon, began
the intramedullary nailing procedure for the treatment of fractures of
the long bones of the lower limbs. By 1939, he had
devised the intramedullary nailing technique. During World War II, he
used the IM nails extensively on soldiers with fractures of the lower
limbs and made them “fighting fit” in a matter of days rather than the 6
months or more required by other techniques. IM nails were kept a
closely guarded secret throughout WW II. The rest of the world came to
know about this method of treatment only when the prisoners of war
came back home, and X-rays revealed large steel nails in the bones. The
patients also described the rapid recovery as a result of new method.
Still it took another 20 years before IM nails became a standard method
of treatment in rest of the world.
Gavril Abramovich Ilizarov, a Russian surgeon, devised an
alternative method of treating fractures, called external fixation. Soon
after his graduation, in the 1950s, he was sent, without much orthopedic
training, to look after injured Russian soldiers in Siberia. With no equipment,
Ilizarov was confronted with crippling conditions of unhealed and
infected fractures. With the help of a local bicycle shop, he devised ring
external fixators tensioned like the spokes of a bicycle wheel. With this
equipment, he achieved healing, realignment and lengthening of bones to
a degree unheard of elsewhere. During the last decade, using this technique,
Russian doctors have claimed to increase the height of many
short-stature individuals by up to 12 cm.
JOHN CHARNLEY
JOINT REPLACEMENT THERAPY

The use of an artificial hip implant had been tried since the 1930s. Before the
advent of antibiotics, this carried a high risk of failure because of infection.
Even the materials tried soon wore out, and the patient needed
another implant within a couple of years. In the 1960s, a Burmese orthopedic
surgeon, Dr San Baw (1922-1984), pioneered the use of ivory hip
prostheses to treat ununited fractures of the neck of the femur. The first
such implant was given to an 83-year-old Burmese Buddhist nun. From
the 1960s to the 1980s, Dr Baw performed over 300 ivory hip replacements,
with an 88 percent success rate. Owing to the physical, mechanical, chemical,
and biological qualities of ivory prostheses, ivory was found to bond
better with human tissues than the materials used earlier. In the rest of
the world, artificial hip implants owe
much to the work of Sir John Charnley.
John Charnley (1911-1982) (Fig. 6.19)
was a British orthopedic surgeon who
designed an artificial hip implant in the 1970s.
Charnley’s implant consisted of three
parts—(i) a metal (originally stainless steel)
femoral component, (ii) an ultra-high molecular
weight polyethylene acetabular component,
both fixed to the bone using (iii) special bone
cement. During the last decade, several
improvements have been made in the total
hip replacement procedure and prosthesis.
Fig. 6.19: Charnley

Charnley was trained as a general
surgeon in Great Britain. In 1939, when WW
II broke out, Charnley volunteered his services to the British army.
After a short stint in an army hospital in Ireland, Charnley found himself
posted to the orthopedic unit of an army hospital in Cairo. He was
assigned to head the orthopedic workshop, a component of the Base
Ordnance Workshop. Thus he came in contact with a number of skilled
mechanical and electrical engineers of the workshop. These contacts
helped him to develop an inventive skill that led to the invention of
a walking caliper, a modified Thomas splint, and many other surgical
instruments.
After the end of WW II, Charnley continued to work in the orthopedic
department of a hospital in Manchester, where he developed a hip
center in 1949. The hip center was to become the focus of his career for
the rest of his life. A search began for a material to constitute an
artificial cup (acetabulum) which could be inserted into the hip bone and a
femoral head made of steel, to act as an artificial ball-and-socket joint.
Such implants had been devised earlier, but the friction between the two
components led to fast wear and tear of the implant, which made its life-
span very short. By 1956, Charnley had selected a self-lubricating
synthetic material, polytetrafluoroethylene (PTFE). In vitro this material
gave excellent results. Within a few months about 300 patients had been
given PTFE prostheses. Soon it became apparent that in vivo, PTFE
showed a high wear rate (about 0.5 mm per month). Still worse was the
problem that the wear and tear debris of PTFE seemed to produce an
intense inflammatory reaction. To verify the second problem, Charnley
got finely ground PTFE powder injected into his own thigh. Within
days, a severe inflammatory reaction was apparent. The PTFE
artificial hip implants had been launched with such a fanfare that Charnley
found acceptance of the folly of PTFE “like a monk pouring ashes over
his own head.” Still, he did not lose heart and started the search for
another suitable material. By the early 1970s he started using ultra-high
molecular weight polyethylene (UHMWPE) for the acetabular
component of the implant.
Charnley's surgical skill was remarkable. He could complete the hip
implant operation 2-3 times faster than his colleagues. He also insisted
on personal training of each colleague joining his department. According
to him, an orthopedic surgeon must possess faculties to adapt to a wide
compass; the delicacy of a neurosurgeon, required in nerve and tendon
surgery; accuracy of a sculptor wielding the osteotome and heavy mallut;
the engineering skill of a fitter in using precision tools in bone grafting
and internal fixation; the indefinable art of closed reduction of a bone
setter; pleasure in perfect dissection under a tourniquet and satisfaction
in the carnage of hindquarter amputation.
These days, hip or knee replacement therapy has become very
common. In the USA, about 150,000 hip implants and about 130,000
knee implants are given every year. A success rate of over 90 percent
can be achieved in good hands.
Arthroscopic surgery is one of the latest techniques in orthopedic
surgery. It has been considered the biggest advancement in orthopedics
in the 20th century. Total joint replacement is no doubt more dramatic
but the reduced perioperative morbidity and faster rehabilitation make
arthroscopic surgery almost miraculous. The idea of endoscopic
examination of joints came to the mind of a Japanese surgeon, Dr Kenji
Takagi, who examined the knee of a cadaver using a cystoscope.
Ultimately it was Dr Masaki Watanabe, a student of Dr Takagi, who
succeeded in devising the first arthroscope in the 1950s and published an
Atlas of Arthroscopy. Arthroscopic surgery became a reality in the 1980s.
It is particularly important for the injured athletes for cartilage surgery
or reconstruction of torn ligaments. The procedure requires only a few
hours of hospitalization instead of the weeks required for the previously
used open-joint surgery. Now more than 600,000 arthroscopic
procedures are performed each year in the USA alone.
PK SETHI
Jaipur Foot is the name given to a low-
cost prosthetic foot developed in the 1960s
by Dr Pramod Karan Sethi (Fig. 6.20),
an Indian orthopedic surgeon with an FRCS
degree, in collaboration with Ram Chander
Sharma, an artisan educated up to the fourth
standard. In those days, Dr Sethi, head of the
orthopedic department, Sawai Man Singh
Hospital in Jaipur, was using Western-
designed artificial limbs which were not
only very expensive and outside the reach
of a common man, but also unsuitable for the Indian way of life.

Fig. 6.20: Sethi

“Most of us sit, eat, sleep and worship on the floor, without shoes,”
says Dr Sethi. He asked
for the help of Ram Chander Sharma, a well-known artisan of Jaipur, to
devise an artificial foot suited to Indian conditions. Sharma is a
fifth-generation master craftsman who, in those days, made a living
making life-like sculptures for the temples of Jaipur. Sethi and Sharma used
various types of materials, like willow, sponges, and aluminum molds,
but the resulting prosthesis was either too fragile or too unwieldy.
One day, while riding a bicycle to the hospital, Sharma got a punctured
tyre. For its repair, he walked to a nearby shop, where he saw a truck
tyre being retreaded with vulcanized rubber. Once the bicycle was fixed,
Sharma raced to the hospital and talked to Dr Sethi. Soon after, both
returned to the tyre-repair shop, with a request to prepare a rubber foot
using a foot cast and the vulcanized rubber. Within a few days, a prosthetic
foot was ready. Using it, an amputee could squat, ride a bicycle or even
climb a tree. More importantly, the Jaipur foot costs less than 30 dollars
as compared to the 8,000-dollar cost of a US-made prosthesis. In 1971, Sethi
presented the invention to the British orthopedic surgeons at Oxford,
who were impressed by the artificial limb’s suppleness and durability.
However, resistance from other Indian orthopedic surgeons did not allow
its widespread use in India. The prosthesis gained international
recognition only when the Russian invasion of Afghanistan in the late
1970s and 1980s resulted in thousands of amputees due to land-mine
injuries. The Western aid agencies used thousands of Jaipur foot prostheses in
Afghanistan and later in many other countries. At present, in India,
about 72,000 people are using Jaipur foot.
For the invention of the Jaipur foot, Sethi received numerous national
awards including the Padma Shri in 1981. The same year Dr Sethi received
the prestigious Ramon Magsaysay Award, which includes a sizeable sum
of money. Some people are critical of Sethi, who did not care to share the
award money with his equally important collaborator, Ram Chander
Sharma. However, Sharma shows no bitterness over Sethi’s fame and
riches. These days, Sethi is leading a retired life, but Sharma still helps
an organization making Jaipur foot and continues to design other types
of artificial limbs for the poor.

HISTORY OF OBSTETRICS AND GYNECOLOGY—CHILDBIRTH
Childbirth is seldom easy. Before the development of modern
medicine, it was dangerous, too. About one in two hundred deliveries
ended in the death of the mother. Till the middle of the 17th century,
childbirth was conducted by a midwife or a female relative. Physicians,
always males, were not allowed to witness childbirth (Fig. 6.21). For
centuries, midwives were self-trained, mostly by hands-on experience.
The first school of midwifery was established in Paris by a barber-surgeon,
Ambroise Pare, in the middle of the 16th century. Pare described
podalic version for breech delivery and the caesarian section. The caesarian
section was performed mostly when the
mother had died during childbirth. Male
midwives (“accoucheurs”) became
fashionable in France by the middle of the 17th
century. The best known of the French
“accoucheurs” was Francois Mauriceau
(1637-1709), who is remembered by
today’s obstetricians for the “Mauriceau-
Smellie-Veit maneuver” for dealing with the
aftercoming head in a breech delivery. In
1668, Mauriceau published his celebrated
Fig. 6.21: Clinical dia-
textbook on the midwifery, which was
gnosis of pregnancy:
translated into several languages and went 19th century England
through several editions. He was indeed an
innovator. He pioneered primary suturing of the perineum after
delivery—“cleansing with red wine and then applying three or four
stitches.” Mauriceau also introduced the practice of delivering women
in bed rather than on a stool, as done earlier.
Man-midwifery reached Britain in the 17th century, but remained
less fashionable than in France. The most famous British obstetricians
and barber-surgeons of the 17th century were three generations of the
Chamberlen family. They are remembered for the invention of the
obstetrical forceps. The family was of French origin; it fled France in
1569 to escape persecution and settled in Britain. Peter Chamberlen
the elder is believed to have invented the obstetric forceps, a two-
bladed instrument. The instrument was kept a closely guarded secret.
For home deliveries, it would be carried in an enormous box and always
used under cover of a bed sheet or a blanket. Peter the elder, his son,
Peter Chamberlen junior and grandson Hugh Chamberlen, all
obstetricians, maintained the secrecy of the obstetrical forceps for more
than half a century. With this forceps, Chamberlens could deliver babies
even in difficult labors that no one else could manage. That is why the
Chamberlens became famous all over Europe, especially when Peter the elder became
surgeon to the Queen, and attended the wives of both King James I and
Charles I in childbirth. Hugh Chamberlen tried to sell the secret forceps
for a large sum of money. He was asked to demonstrate the use of the
instrument on a rachitic dwarf woman in labor. Hugh failed to save the
baby as well as the mother and the deal fell through. The Chamberlen
forceps was finally discovered in 1813 in their family home in Essex.
Even after the design of Chamberlen forceps became public knowledge,
its use remained controversial. Initially, its use remained restricted to
some man-midwives who lived near the Chamberlens in Essex and a few
specialists. One such specialist was William Smellie, who led the way in
the 18th century in establishing obstetrics as an academic discipline in
Britain.
William Smellie (1697-1763) was a physician who studied
midwifery in the first British School of Midwifery, founded in 1725. In
the 1740s, he started his own course of midwifery. The course lasted
two years and cost 20 guineas. It was prominently advertised
that “The men and women are taught at different hours.” Among Smellie’s
many contributions to obstetric practice was the improvement of the
obstetrical forceps. The original Chamberlen forceps had a cephalic
curve to fit the baby’s head but no pelvic curve characteristic of modern
obstetrical forceps. Smellie not only added the pelvic curve to the forceps
but also adopted the “English lock,” which allowed the two blades to be
inserted separately into the vagina and then brought together. Smellie
published his landmark Treatise on the Theory and Practice of Midwifery
in 1752.
William Hunter (1718-1783) graduated in medicine and joined the
practice of Smellie in London. At the age of 44, Hunter became consultant
to Queen Charlotte. He is perhaps most famous for his Atlas of the
Gravid Uterus. Hunter exemplifies the development of British obstetrics
in the late 18th century. Although he knew about the use of forceps in
delivery, he took pride in using them rarely. He was one of the first
to enter the field of normal labor, which had hitherto been the prerogative
of the female midwives. In the case of the delivery of Princess Charlotte,
aged 21, the labor lasted 50 hours, but the attending obstetrician, Sir
Richard Croft, a pupil of William Hunter, did not use the forceps. The
baby was stillborn and the Princess herself died six hours after the
delivery. Croft could not tolerate the widespread criticism and shot
himself.
At the start of the 19th century, childbirth was still dangerous to women
and remained so even in the early part of the 20th century. In England,
about one in 200 childbirths ended in the death of the mother. Among the
poor, a rachitic pelvis made delivery difficult, but postnatal maternal
mortality affected all social classes. In the maternity hospitals, the death
rate was often as high as eight per 100 deliveries. The chief cause was the
“childbed fever,” which later came to be known as puerperal fever.
Ignaz Philip Semmelweis (1818-1865) was a Hungarian physician
and titular house officer in the First obstetrical clinic of a large hospital
in Austria, under Professor Johann Klein. Semmelweis was baffled to
see 13 percent maternal mortality due to childbed fever in his ward as
compared to 2 percent mortality in the Second obstetrical unit of the
same hospital, though both units used similar methods.
The only difference he could observe was that whereas the first unit was
involved in teaching medical students, the second unit was training only
midwives. The medical students and doctors of the first unit would
often do an autopsy on a cadaver and then walk into the labor room and
conduct a delivery. Then, one of his friends, a pathologist, died after
having accidentally punctured his finger during a postmortem of a patient
of puerperal fever. Semmelweis conducted the postmortem on his friend’s
body himself. He was amazed to see pathological conditions similar to
those he had earlier seen in women dying from puerperal fever.
Semmelweis came to the conclusion that the doctors and medical students
carried “cadaveric contamination” to the patients in the labor room. He
devised a new protocol in 1847 that required every person to wash his
hands with chlorinated lime water before examining a patient in his
obstetrical unit. Washing of the hands before examining a patient was
considered a waste of time by most of the doctors of that unit. Moreover,
the doctors were not ready to admit that they themselves had been
carrying the “cadaveric contamination” to their patients and killed them.
Somehow, Semmelweis was able to enforce the new procedure in his
unit. Within a month the maternal mortality fell to 2 percent, as in
the Second obstetrical unit. The idea was not to the liking of his boss, Dr
Klein, and Semmelweis was asked to leave the hospital. That was the
time before the “germ theory” of disease was accepted. Thus, women
continued to die from puerperal fever for many decades, until its cause
was finally identified by Louis Pasteur in 1879 as the streptococcus,
and the era of antisepsis was begun by Joseph Lister. By the 1880s
Listerian antisepsis was adopted by most of the British and American
lying-in (maternity) hospitals. The transition from antisepsis to asepsis
was far less controversial and fairly rapid.
James Young Simpson (1811-1870) is another famous name in
obstetrics. He became a professor of midwifery in Edinburgh in 1840
and physician to the Queen in Scotland in 1847. Simpson refined the
obstetrical forceps, producing a design that is still in use today. He also
experimented with a vacuum extractor. Simpson is especially remembered
for the first use of chloroform to relieve the pains of childbirth. Within
a few months of the discovery of ether as an anesthetic in the USA,
Simpson discovered the anesthetic property of chloroform. Within four
days of the discovery, chloroform was used by Simpson on a patient
with a deformed pelvis in labor. The mother was so grateful that she
named the baby girl “Anesthesia.” Three weeks later, the results were
presented in a meeting of Medico-Chirurgical Society. Simpson met
with strong opposition from doctors and the clergy, who quoted the
book of Genesis: “In sorrow thou shalt bring forth children.” Or, “If
God had wished labor to be painless, He could have made it so.”
According to the Scripture, childbirth pain originated when God
punished Eve and her descendants for Eve’s disobedience in the Garden
of Eden. All these arguments bit the dust when, in 1853, Queen Victoria
of England asked for the use of chloroform during labor and expressed
extreme happiness with the experience. After that, pain-free childbirth
became a common procedure, till the harmful effects of chloroform came
to be recognized.
In the early 20th century, a new European technique called Twilight
Sleep became popular for the eradication of childbirth pain. It consisted
of the administration of morphine and a disorienting drug called
scopolamine. It was soon abandoned because the pain was still significant
unless too high a dose was administered. Its only dubious value was that
the patient did not remember the pain or any event of the childbirth.
By the middle of the 20th century, safe and effective epidural
anesthesia came to be commonly used to ease labor pains. However,
in the USA, 1960s and 1970s saw a movement in favor of nonmedicalized
natural childbirth.
Due to all these advances in maternal health care before delivery
and during parturition, the maternal mortality rate fell significantly in
the developed countries. Unfortunately, the same is not true about
developing countries like India, Nepal, and Bangladesh where maternal
mortality rates continue to remain high.

CESAREAN SECTION
The origin of the word cesarean is doubtful. The word is rumored to
have originated from Julius Caesar, who is said to have been the first
infant born alive by this method. His mother survived and had four more children
born normally. This account is doubtful, because in those days, such a
surgery was known to produce 100 percent mortality in the mother.
According to Greek mythology, Apollo removed Asclepius from his
dead mother’s abdomen. In Roman law under Julius Caesar, it was
decreed that all women who died during childbirth must be cut open to
remove the baby, alive or dead. Hence, this type of delivery came to be
known as Cesarean operation and later Cesarean section. However,
numerous references to such a surgery appear in ancient Hindu, Egyptian,
Greek, Roman, and other European folklore.
Until the late 19th century, the procedure was performed without any
anesthetic. Four or five men would hold the woman down on a table
(often a kitchen table), while the surgeon would cut her abdomen
with a dirty kitchen or pocket-knife or a razor blade (Fig. 6.22). The
status of the surgeon was judged by the amount of blood or pus sticking
to his surgical gown and the speed with which he performed the operation.
The surgeon would cut through the abdominal wall and through the
uterus. After the baby had been extracted, he would sew the skin incision,
leaving the uterine incision open, in the hope that the uterus would
contract and close the gap.

Fig. 6.22: Ancient cesarean section

Almost invariably, the woman would bleed to death
soon after or succumb to overwhelming infection within a few days. In
view of the exceedingly poor outcome, the procedure was usually
performed after the death of the mother or when she was at death’s door.
The only purpose was the extraction of the infant, alive if possible or
even if dead. Christianity dictated that a pregnant mother could not be
buried before the fetus had been removed. The fetus was to be separately
buried to “save its soul.” According to one report, in Europe, not a single
woman survived a cesarean section between 1787 and 1876.
The discovery of anesthesia in the later part of the 19th century gave the
surgeon more time to perform the Cesarean section carefully, but still
postoperative infection remained very common. By the middle of
the 20th century, the availability of safe anesthesia and antibiotics made
Cesarean section a safe procedure. The most common indication for
Cesarean section till then had been a pelvis deformed by childhood
rickets. In developed countries, better nutrition drastically reduced the
incidence of all deficiency disorders, including rickets.
However, the use of Cesarean section continued to grow. It now accounts
for 20 percent of deliveries in the UK and 30 percent in the USA.
Some celebrities and rich women ask for Cesarean section, without any
medical indication, because they are not willing to suffer the childbirth
pains. “They are too posh to push,” as a wag commented. In India,
Cesarean section is sometimes performed so that the baby may be born
at a particular auspicious time determined by an astrologer.

ABORTION
Abortion has been practiced in almost all human communities from
ancient times. Women faced with unwanted pregnancies have resorted
to abortion regardless of religious or legal sanction and often at
considerable risk to their lives. According to one anthropologist, abortion
may be considered a fundamental aspect of human behavior. Greek and
Roman civilizations considered abortion and infanticide as an integral
part of maintaining a stable population, and no social stigma was attached
to it. Hippocrates, the Greek physician whose famous oath forbade the
use of pessaries to induce abortion, nonetheless, writes of having advised
a prostitute who fell pregnant to jump up and down, touching her
buttocks with her heal at each leap, so as to induce miscarriage.
Early Christians condemned abortion, but did not consider
termination of pregnancy as abortion before “ensoulment”, or
“animation,” the time of quickening, when the fetal movements begin to
be felt by the mother. By the end of the 19th century, it was held that
life begins at conception, and therefore the Church banned abortion at
any stage of pregnancy. In the 1850s, two million pregnancies were aborted
every year in the United States. In the 1950s, when abortions were banned,
still about one million illegal abortions were performed every year and
about one thousand women died per year as a result of complications
following an abortion. In 1970, abortion was legalized in the USA.
These days, about 1.3 million abortions are induced every year. In other
words, whether considered legal or illegal, the incidence of abortions has
not changed. In India, about 4 million pregnancies are illegally aborted
every year, in spite of abortion being legal. In some parts of North India,
particularly Rajasthan, female infanticide has been practiced for ages.
Recently this scourge has spread to other nearby states, particularly
Punjab and Haryana.
The methods used to induce abortion have varied from fasting,
blood-letting, poisonous herbs, exposure of the female to wild animals,
jumping up and down, heavy exercise, heavy swimming, toxic pessaries,
long needles, hooks, knives, sitting over a pot of steam, stewed onions
or a heated coconut, ergot, and suction apparatus to recent surgical
techniques like curettage and, more recently, the administration of
prostaglandins.
Abortions have not always been a hush-hush affair. In the 1840s,
Madame Restell was a well-known abortionist in the USA, with a roaring
business in many big cities. Her annual expenditure on advertising
alone was 60,000 dollars. She offered a safe cure of “complaints
incidental to female frame” and claimed to resolve “all cases of suppression
or stoppage of menstruation, however obdurate.” Arrested a number of
times, she ultimately committed suicide.

CONTRACEPTIVES
Contraception is as old as mankind. Whether rich or poor, old or young,
in all cultures women have wanted to avoid unwanted pregnancies and to
enjoy their sexuality without obstacles. Besides this, another reason
applicable since ancient times has been the need to protect limited
resources from overpopulation. A rich man did not want fragmentation of
his property among a horde of offspring.
The earliest mention of contraception is found in the Bible in the
form of coitus interruptus, a method still in use but with doubtful
efficacy. Coitus interruptus was popular at the time of Mohammad and
Islamic culture endorses the practice. Still earlier, a record of 1850
BC in Egypt mentions the use of a mixture of honey and salt to irrigate
the vagina before sexual intercourse (scientifically sound: sperm cannot
swim in the highly viscous honey!). However, one may not endorse the
practice of using crocodile dung pessaries or certain gums advocated by
some Egyptians of that era. Aristotle (350 BC) wrote that homosexuality
was officially condoned as a measure of population control. Around that
time, anal sex was considered a favor to the female since it would not
result in a pregnancy.
Around the middle ages, royalty and men of standing would
have sex with their wives only for procreation. It usually resulted in a
rapid succession of births. After this, they would spare their wives and
turn to prostitutes for recreation.
Ancient Egyptians and Romans seem to have used condoms made
from animal intestine. They were expensive but could be washed and
reused. The origin of the word condom is controversial. Legend has it
that the condom was the creation of one Dr Condom, during the reign of
King Charles II (1660-1685). The king was alarmed at the number of
illegitimate children he had. He had 13 of them. It is not clear whether
this number was before the creation of condom by his court doctor, or in
spite of its use. In 17th-century Europe, condoms were chiefly used
as a protection against sexually transmitted diseases, especially, against
the most dreaded, syphilis.
In 1850, the use of condoms became more popular with the
manufacture of vulcanized rubber condoms by the famous Goodyear
tire company. Interestingly, such condoms were nicknamed “French
letter” in England, and “la capote anglaise” in France. During World
War I, the American troops were not officially allowed to use condoms.
At that time the American public believed that “loose sexual behavior”
deserved to be punished by disease. As a result, thousands of illegitimate
children were left behind when the American troops went home after the
war was over. By the time of World War II, better sense prevailed and the
American troops were encouraged to keep condoms as a part of their
emergency kit. A training film urged, “Don’t forget—put it on before
you put in.” Now, as a measure of protection against the spread of
AIDS, the use of condoms is at an all-time high in the world.

Vasectomy and IUCDs


As early as 1884, experimental work on dogs and rabbits revealed no ill
effects of vasectomy on testicular function. In 1907, compulsory
vasectomy was started in the USA on confirmed criminals, idiots, rapists,
and imbeciles as a part of the eugenics movement. During the next 4-6 decades,
sterilization of people with “undesirable” traits was widely performed
throughout USA and Europe. Germany was most notorious for this
campaign: from 1933 to 1945, 400,000 males were vasectomized under its “Fitness to Marry” program. In the 1960s and 1970s, many governments in Asia adopted voluntary vasectomy as a part of national family planning programs. Due to an imaginary fear of losing manhood, vasectomy never became very popular in India; instead, Indian males preferred their wives to undergo tubectomy.
It is believed that Arabian and Turkish traders inserted small stones
into the uterus of their camels in order to prevent pregnancy during long
travels in the desert. In the Middle Ages, wooden blocks were used as vaginal pessaries. By the 1970s, T-shaped IUDs were quite popular in many parts of the world. Due to the risk of infection of the genital tract and the cheap availability of oral contraceptive pills, IUCDs have now become mostly redundant.

Oral Contraceptives
Margaret Sanger (1879-1966) (Fig. 6.23)
was the most vocal exponent of family
planning and contraceptives. She carried
on the campaign for over five decades, in
spite of being arrested many times. Ultimately, she succeeded both in getting the century-old official ban on the sale of contraceptives in the USA abolished and in bringing about the development of the first oral contraceptive, “the Pill.”
Fig. 6.23: Margaret Sanger

Sanger’s commitment to birth control arose from a personal tragedy. She was one of the eleven children her mother gave birth to. In addition, her mother suffered seven miscarriages. For a poor working
class family, this burden of repeated pregnancies was too much for the mother. At the age of 19, Sanger saw her 50-year-old mother
dying of tuberculosis. Facing her father over her mother’s coffin, Margaret
lashed out, “You caused this. My mother is dead from having too many
children.”
Margaret was trained as a nurse. She started her clinic in a part of
New York City inhabited mostly by poor Black families. Lacking proper contraceptives, many women, when faced with another unwanted pregnancy, resorted to illegal five-dollar back-street abortions. When the
abortion got complicated, Sanger was called in. Seeing hundreds of such
mothers, Sanger was convinced about the need of an effective
contraceptive. She coined the term “family planning.” In the beginning,
her idea of family planning was related to eugenics—poor Black women
should have fewer children, whereas “intelligent” and rich Whites should
have still more. Over the years, she came to feel that control over pregnancy was the right of every woman, Black or White. However, a law promulgated in 1873 prohibited the dissemination of information about contraceptives or their sale.
To begin with, Sanger, working with volunteer friends, started
publication of a magazine, The Woman Rebel, in 1914. The first issue stated that the aim of the magazine was to “stimulate the women to think for
themselves and to build a conscious fighting character.” The magazine
started giving information about various contraceptive measures available
at that time, especially the condom and the diaphragm. Dissemination
of such information was not only illegal, but was also bitterly opposed
by the Roman Catholic Church. She was charged with distribution of obscene
material, which carried a maximum sentence of 45 years in prison.
Sanger fled to London, leaving her husband and children. She came back
after two years, when charges against her were dropped. In order to win
over public opinion, Sanger founded the National Birth Control League
as well as the first Birth Control Clinic in 1916. The clinic became
overwhelmingly popular, but as expected, within a few weeks, the
police conducted a raid and shut it down. Sanger and two other nurses
working there were arrested. In protest, hundreds of women came to the
streets and chased the police. Sanger was imprisoned for 30 days. Back
home from the prison, Sanger started the family planning clinic once
again and started a nationwide tour to propagate the philosophy of birth
control and to popularize the use of contraceptives. By 1950, Sanger had had brushes with the law many times, but was frustrated that fertility control still did not rest in the hands of women: the use of the condom was determined by the male. She wanted a “magic pill” which a woman could take without anyone else knowing, so that she would become pregnant only when she chose to. But medical researchers were not
interested. Her search ended in 1951 when she met Gregory Pincus, a
medical expert in human reproduction, who was willing to take up the
project. They also found Katharine McCormick, a rich lady willing to
finance the project. In 1960, Enovid, the first oral contraceptive was
launched. The final victory came in 1965 when the Supreme Court of
the USA declared that the use of contraceptives was a fundamental right
of an American citizen.
The “pill” was so effective, and its use, particularly in the USA, became so widespread, that it became a symbol of women’s liberation. It opened the door to a psychological mind-set of a life beyond having kids and being a housewife. It gave women a freedom
never seen before. However, soon a great debate started on the
consequences of growing premarital sex and promiscuity on the social
norms of American society. In spite of all the hullabaloo, 40 years later, oral contraceptive pills remain among the most widely used medicines in the US. In other parts of the world, the
pill is gaining popularity as the most simple and effective contraceptive.

GYNECOLOGY
Ephraim McDowell (1771-1830) is remembered as the father of
Gynecology. He is credited with the first successful ovariotomy in
1809. McDowell spent two years at a medical school in Edinburgh but
never graduated in medicine there or elsewhere. Even then, by 1809, he was one of the most highly regarded surgeons of Kentucky, USA. He
was awarded an Honorary MD degree by the University of Maryland
in 1825.
The operation of ovariotomy was performed on a Black woman.
The ovarian tumor was so big that the pendulous abdomen reached
almost to her knees. A few days before the operation, the patient traveled 60 miles on horseback, resting the abdomen on the saddle. In those
days, no one ever opened the abdomen for fear of almost inevitable
postoperative infection. It was a preanesthetic era. Save some narcotics
like opium or alcohol, there was no other means of pain control during
surgery. Several attendants were used to restrain the patient during
surgery. The operation took 30 minutes. An ovarian tumor weighing 22
pounds was taken out. There were no postoperative complications and
the patient returned home in 25 days. It is interesting to recall that
people in the town had gathered around the house of McDowell, with a
noose to hang him if he failed in his “butchery.”
The absence of postoperative infection has been attributed to the
fact that McDowell used to perform surgeries at his home and the
sheets used on the patients were invariably boiled to remove the blood
stains. At that time, even the most advanced surgeons of Europe were
afraid of abdominal surgery. The report of a successful ovariotomy by
an untrained backwoods American physician was at first disparaged by
the British surgeons. McDowell’s claim of the remarkable surgery was
accepted when he repeated the success in 12 more operations on ovarian
tumors.
History of Development of Specialities | 393

James Marion Sims (1813-1883) (Fig. 6.24) opened in New York the
world’s first hospital devoted to diseases
of women in 1855. He is remembered for
the development of the surgical treatment
of vesicovaginal fistula, Sims speculum
and the Sims position for pelvic
examination.
Fig. 6.24: Sims

Sims was not a bright student in medical school but, after graduation, he was considered a skilful surgeon. His interest in women’s diseases was aroused when he examined a slave woman with a vesicovaginal fistula in 1845.
Within a few days, two more slave women with a similar problem were
referred to him. In the literature, he could not find any surgical treatment
for the disorder. The basic problem was that the site of the fistula in the
vagina could not be visualized. For this purpose he devised what is now
known as Sims speculum. Using the same, he wrote, “I saw everything
as no man had ever seen before. The fistula was as plain as the nose on
a man’s face. I felt I was on the eve of one of the greatest discoveries of
the day.” Sims was well-known for his over-elaborate prose in his
publications. However, it may be said to his credit that, although Sims
did give fascinating descriptions of his achievements, he did not mind
describing in detail his failures as well.
The three slave women with vesicovaginal fistula were subjected to
various surgeries without anesthesia, but there was no success. Only after four years and 30 operations could the vesicovaginal fistula be repaired in the first patient. Thus, the technique for the repair of
vesicovaginal fistula became known to the surgeons of the world. The
career of Sims took a turn and he decided to entirely devote himself to
the surgical treatment of vesicovaginal fistula. For this purpose he opened
a hospital for women’s diseases. Gradually, his fame reached Europe
and he was invited to operate on the rich and the famous, including the
wife of Napoleon III of France.
Some historians have criticized Sims’ experimentation on the poor
slave women without anesthesia. The fact is that anesthesia was not
being used by any of the surgeons of that time. He was trying to treat
the patients as best as he could. It is a matter of chance that the first
three women were slaves, who agreed to the repeated operations in the hope of escaping the miserable condition they were in. As one doctor of that
era commented, “A sadder situation can hardly exist for a woman afflicted
with a vesicovaginal fistula. It was a source of disgust, even to herself.
The woman, earlier beloved of her husband becomes an object of bodily
revulsion to him. These women are rejected by their husbands and
sometimes by their families merely because their bodies suffered excessive
trauma during childbirth.” Even today, more than two million women in the world suffer from obstetric fistulas each year, especially in Africa, parts of Asia, and Papua New Guinea, areas where proper medical care is not available for antenatal check-ups and deliveries. The significance of Sims’ work can be fully appreciated only after examining a few such patients.
Lawson Tait (1845-1899) was a pioneer in pelvic and abdominal
surgery in England. While McDowell had successfully performed
ovariotomy in 1809, in other hands the operation had a mortality of over 90 percent. Tait improved the technique, and his meticulous cleanliness, novel at that time, reduced the mortality of such operations to about 10 percent. Another advantage for Tait was the availability of anesthetic agents. Tait successfully performed thousands of ovariotomies. He was
instrumental in the opening of the Birmingham Hospital for Women,
where he worked for 20 years. He is also recognized for introducing the
surgeries for ectopic pregnancy, and tuboovarian abscess. He devised an
operation of total hysterectomy. Sims and Tait are considered fathers of
gynecology.

BABY INCUBATORS
Prior to the invention of incubators, care of infants, including premature infants, was mainly the responsibility of the mother. The job
of the obstetrician was over with the
delivery of the baby. General physicians
seldom showed interest in the medical
care of the newborn. Consequently,
neonatal death rate was fairly high. In
case of premature babies, the death rate
was as high as 85 percent. Premature
infants were expected to die and hence
no one seemed to bother. Stephane Tarnier (Fig. 6.25), a French physician, was the first to notice that hypothermia, caused by exposure to cold, was the cause of death in most premature babies. A visit to the chicken incubator on display in a Paris zoo inspired him to have a similar apparatus installed in his clinic in 1880. Tarnier’s first baby incubator housed several infants (on the pattern seen in the zoo) and was warmed by an external heating source.
Fig. 6.25: Tarnier
Stephane Tarnier is
remembered as the first neonatologist of the world.
The invention of an incubator by Tarnier came at a time when
French politicians noticed with alarm the fall in the country’s birth rate,
which in 1870 was half of rival Germany’s. They created a fear of
eventual “depopulation” of France. Therefore, the hospitals were asked
to take special care of the newborn. As a result, special wards “Hospital
for the weaklings” were created and attached to maternity hospitals. In
these wards, premature babies were admitted for treatment. However,
these hospitals could not save most of the babies, since the mothers had
no role in their care and seldom visited the hospital.
Some of the French physicians of that era, notably one Adolphe
Pinard condemned the use of incubators for the treatment of premature
“weaklings.” A strong believer in eugenics, Pinard argued that “these
premature weaklings, if saved by medical treatment, would remain weak
and infirm throughout their life and as adults, give birth to unhealthy and
weak babies. Over the years, France would be populated by unhealthy
population.”
After the retirement of Tarnier, Pierre
Constant Budin (Fig. 6.26) became head
of the Maternity hospital in 1895. Budin
discouraged the previous practice of
admitting the premature babies apart from
their mothers. He insisted that the mother,
or a wet nurse, must remain with the baby
in the hospital. He developed incubators
with a glass cover through which the
attendant could keep minute to minute
watch on the baby. With this system, the survival rate of the premature babies rose dramatically.
Fig. 6.26: Budin
Through the publication of his book “The Nursing”, in
1900, Budin was recognized as an international authority on the care of
the premature infants. His uncompromising insistence on breastfeeding and maternal involvement in the care of the newborn was a forerunner of similar movements today.
In late 1891, Alexander Lion, a French obstetrician, devised a more
refined baby incubator. It was a large metallic device equipped with a
thermostat, and a forced ventilator (Fig. 6.27). The equipment was very
expensive, but Lion, more like an entrepreneur than like a physician,
raised money by opening the incubators, along with the premature infants, to the public for a fee. Thousands of people flocked to his hospital to have
a look at tiny infants in the machines. Encouraged by the public response,
Lion opened a Kinderbrutanstalt (child hatchery), an elaborate incubator baby show at the Berlin Exposition of 1896, which
became the most talked about show of the
exhibition. By the year 1900, incubator
baby shows became a regular feature of
World Fairs.
Dr Martin Couney, who had worked
with Lion, brought the incubator baby shows
to the United States. Here, he found the
right atmosphere for his curious blend of
science and showmanship. With the help of his daughter and an assistant, Couney rapidly built an empire of incubator sideshows in city after city.
Fig. 6.27: Lion incubator
At the shows, he employed “barkers” who
would shout at passersby and invite them to watch the show. At one
stage, Cary Grant, who later became a legendary Hollywood star, was
one of his barkers. At Coney Island in New York, Couney set up a
permanent incubator baby exhibition which ran continuously from 1904
to 1943. Certain people, especially childless women, would visit the incubator hospital exhibition more often than even the parents of the
infants admitted there. They would become attached to a particular
premature tiny infant and come week after week to see its growth.
When the baby was discharged, their attention would divert to another
newly admitted baby.
The idea of displaying infants at the exhibitions was a controversial
topic at that time. Societies for prevention of cruelty to children protested
vigorously. They claimed that Couney’s main aim was not to save the
babies, but to make money. Couney launched a publicity campaign,
pointing out that these exhibition hospitals treated, free of cost, premature
babies from any socioeconomic or racial background, whereas other
hospitals were so expensive that only the rich could afford to admit
their babies. Thus, Couney handled more premature babies than any
other physician in the country, and his rates of survival were enviable,
even by today’s standards. Gradually, however, people lost interest in these shows, though Couney maintained his lavish and extravagant lifestyle. By the early 1940s, Couney was broke; he died in poverty in 1950.

WET NURSING AND ARTIFICIAL FEEDING


A wet nurse is a woman who breast-feeds another woman’s baby for money. (A dry nurse is a woman who takes care of an infant but does not breast-feed). Wet nursing has been practiced since time immemorial. In the pre-Christian era, wives of Greek kings and nobles
were expected to breast-feed only the eldest son, who inherited the
kingdom. Other children were left to the wet nurses. In one instance, a
second son of a king inherited the kingdom because he had been nursed
by his mother, while his elder brother had been wet nursed. In ancient
Rome and Greece, wet nursing was usually assigned to one of the lactating
slaves. The slave wet nurses got more respect in the household than
other slaves.
In 1472 AD, Paul Bellardus, an Italian physician, wrote the first
pediatric text, which included a section on the qualities of a good wet
nurse. From the 16th to the 18th century, well-to-do women in Europe and North America did not nurse their infants for fear of losing their beauty. Moreover, in order to have an hourglass figure, they wore such tight undergarments (corsets) that the breast tissue and the nipples were damaged.
So, some of them could not breast-feed their infants even if they desired
to do so. Employing a wet nurse was also considered a sign of family’s
high status in the society.
The contraceptive effect of breastfeeding was known even in those
days. Since the infant mortality rate was very high, breastfeeding was
avoided so as to have as many children as possible. It was not uncommon
for the noble women to have 12-18 pregnancies, whereas the women
who worked as wet nurses usually had about half a dozen children. In
addition, it was believed that a breastfeeding woman should not have
sexual relations during lactation lest it taint her milk. (Sexual needs of
the noble men and women, obviously, had greater importance than those
of the poor wet nurses and their husbands).
By the latter part of the 18th century, the detrimental effects of wet nursing came to the notice of physicians. Dr William Cadogan wrote an essay on nursing and the care of the baby from birth to three years of age. He reported that children who were breast-fed by the mother showed better
growth than those who were wet nursed. Moreover, he reported that
breastfeeding decreased the incidence of mastitis and breast engorgement.
Thus, he stressed that breastfeeding was beneficial both to the baby as
well as the mother.
Artificial Feeding. The practice of artificial feeding seems to be as
old as mankind. Baby-feeding vessels have been found in Egypt dating
about 2000 BC. A mother holding a very modern-looking milk bottle is
seen in a relief found in the ruins in Egypt dated 888 BC. Moreover, clay feeding vessels were found in the graves of infants dated 1st-5th century AD in Rome. However, by and large, breastfeeding has been practiced in most cultures, though noble women preferred to employ wet nurses instead of feeding their babies themselves. By the middle of
19th century, a number of physicians started looking for an alternative
to breast milk, because the wet nurses were found to be the source of
many diseases of the infants, particularly syphilis. Moreover, with the
industrialization and women’s employment in factories, mothers could
not leave the job to breast-feed their infants. Initially, cow’s or goat’s milk was used as a substitute for breast milk.
The first formula milk was developed by Henri Nestlé in the
1860s in response to the high mortality rate among infants in Switzerland
in foundling homes (orphanages). It was a combination of cow’s milk
and cereals and was called Farine Lactée. But infant formula became
increasingly popular during the 20th century as advertising entered its
golden age. The baby boom that followed WW II provided a market for
the expanding infant formula industry. Between the years of 1946 and
1956, the already decreasing incidence of breastfeeding was halved in
the United States, leaving only 25 percent of infants still being breast-fed at the time of hospital discharge. During the 1960s, when birth rates tapered off in Europe and the USA, infant formula companies began
marketing campaigns in nonindustrialized countries. Unfortunately, in
the third-world countries, because of poor hygiene and contaminated
water, bottle-feeding led to a marked increase in mortality rates among
infants fed formula milk. Organized protests, the most famous of which
was the Nestlé boycott of 1977, called for an end to unethical marketing.
Since the 1980s, the US and many other countries have made increasing
breastfeeding rates a priority in improving the lifelong health of their
citizens.

INSULIN PUMP THERAPY


Insulin pump therapy (also known as Continuous Subcutaneous Insulin
Infusion Therapy) has recently become a popular option for the treatment
of Type 1 diabetes mellitus. The prototype insulin pump is shown in Fig. 7.1, whereas the latest model is shown in Fig. 7.2. The device is especially useful in patients who require multiple doses of insulin every day and still have poor glycemic control.
The pump provides a continuous infusion of low doses of insulin (basal
insulin) around the clock. There is an arrangement for infusion of higher
doses, e.g. just before intake of food. Because of the high cost (6,500 US dollars initially and about 300 US dollars every month), the pump has gained popularity only among the affluent populations of the US and Europe,
where approximately 20 percent of Type 1 diabetics are using insulin
pumps.

Fig. 7.1: Insulin pump prototype
Fig. 7.2: Insulin pump, latest model
Medical Marvels of 21st Century | 403

TELEMEDICINE
Telemedicine means the delivery of medical care at a distance.
Telemedicine may be as simple as two health professionals discussing a
case over a telephone or as complex as using satellite technology and
video-conferencing equipment to conduct a consultation between medical
specialists in different countries, or, more complex still, robotic surgery.
In its early manifestations, African villagers used smoke signals to warn people to stay away from a village with cases of serious disease.
In the early 1900s, people living in remote areas of Australia used two-way
radios, powered by a dynamo driven by a set of bicycle pedals, to
communicate with the Royal Flying Doctor Service of Australia.
Nowadays, two types of telemedicine are in use. First, store-and-forward telemedicine involves acquiring medical data (X-ray pictures,
ECG record, EEG record, blood reports, etc.) and then transmitting this
data to a doctor or a specialist at a convenient time for assessment. It
does not require the presence of both the parties at the same time.
The second type, synchronous telemedicine, involves the interaction between
two doctors via a video-conference in which the two can share the
clinical picture of the patient. For example, a tele-stethoscope allows the
consulting physician to hear the heart sounds or murmurs of a patient
present even on a different continent. Such a system allows all kinds of consultation on almost all kinds of medical disorders. The most demanding type of telemedicine is tele-robotic surgery.
Telemedicine is most beneficial for populations living in remote regions, where a specialist is hard to find. It can also be a useful link
between a general physician and specialists available elsewhere.

ROBOTIC SURGERY
Robotic surgery means the use of a robot in performing surgery. Three
major advances aided by surgical robots have been remote surgery (tele-
surgery), minimal invasive surgery, and unmanned surgery. Major
potential advantages of robot surgery are precision and miniaturization.
Some robots are autonomous; they are not under the control of a surgeon.
The first generation of surgical robots is already installed in a number
of hospitals around the world. It is claimed that more than 3.5 million
robot-aided medical procedures are performed per year in the USA
alone. These are not autonomous robots that can perform surgery on their own; rather, they lend a mechanical helping hand to the surgeon. So
far, these machines have been used to position an endoscope, perform
gallbladder surgery and correct gastroesophageal reflux disease. The
first unmanned robotic heart surgery took place in Rome, Italy, in May,
2006.

ARTIFICIAL HEART
An artificial heart has been developed by an American company. Made
of titanium and plastic, it is about the size of a softball and weighs about
one kilogram. It is driven by a rechargeable internal battery implanted in
the patient’s abdomen. The mechanical heart completely replaces the
patient’s heart, which is removed prior to the implantation of the
mechanical heart. The device is intended to be used in patients with end-
stage heart failure in whom a human transplant is not feasible. So far
the mechanical heart has been tested in 14 men, who lived 5 to 17
months after the operation. The device costs about 200,000 US dollars.
Some people are willing to spend this much money to enjoy one more
Christmas or one more birthday. The use of mechanical heart was
sanctioned in the USA in September, 2006.

FETAL SURGERY
This term means the surgical treatment of certain life-threatening congenital abnormalities in a fetus. Surgical intervention on the fetus during pregnancy is meant to correct problems that would become too advanced to be corrected after birth. Fetal surgery is the latest development in the art
of surgery.
Fetal intervention was first successfully attempted in 1963 with
the transfusion of blood into a fetus. Subsequently, the testing of amniotic
fluid (amniocentesis) for the diagnosis of congenital chromosomal defects
like Down’s syndrome has been practiced for over 20 years.
The development of ultrasound and its widespread use in 1970s
gave the doctors a “real good look” at the developing fetus. Congenital defects now began to be diagnosed in utero, but apart from termination of pregnancy, doctors had no other option. By the 1980s,
experimental fetal surgery began to be tried in lambs and monkeys. The
first fetal surgery was performed in 1981 for the treatment of obstructive
uropathy. Subsequently, the operation has been done for the treatment
of congenital disorders such as sacrococcygeal teratoma, diaphragmatic
hernia, and spina bifida. Even now, no more than three or four surgical centers in the whole world have the expertise to perform such surgeries.
In open fetal surgery, after an incision is made in the uterus (as in a cesarean section), the part of the fetus that needs surgery is exposed outside the uterus. After
corrective surgery, the fetus is returned to the uterus (the incisions are
closed) and pregnancy is allowed to continue till term. In the other
method, called fetoscopic surgery, the surgical procedure is done without
any major incision on the abdomen or the uterus. An endoscope is used
to do the corrective surgery.

GENE THERAPY
“Bubble Boy” was the nickname given to an 18-month-old boy who
suffered from a congenital, potentially fatal defect called the severe
combined immunodeficiency syndrome. The boy, Rhys Evans, had spent
most of his life in the sterile environment of a hospital, since his body
had no resistance to infections, due to a failure of the development of the
immune system. The defect was localized to a single gene. In one of the
first treatments of its kind, British doctors replaced the defective gene by
a normally functioning gene. The operation was done in 2002. Now the
child is living with his parents, taking part in all activities usual for his
age. It was one of the first successful cases of gene therapy. In years to come, it may be
expected that many other congenital disorders would be cured by gene
therapy.

SCORPION VENOM—A DIAGNOSTIC TOOL

Surgery remains the mainstay of cancer treatment. Despite the availability of imaging techniques like MRI, it is often difficult to demarcate the
boundaries of the malignant tissue. Consequently, the few malignant
cells left inadvertently behind multiply and cause recurrence of the
malignancy. This problem is more common in brain tumors where the
surgeon would like to save as much healthy tissue as possible. But
tumors like gliomas are difficult to demarcate. Consequently, such tumors
have a recurrence rate as high as 80 percent and the fresh malignant
tissue is usually found at the edge of the resected area.
The sting of certain species of scorpions contains a chemical which has been named chlorotoxin. Chlorotoxin has the property of binding to
the malignant cells but not to the healthy cells nearby. By attaching a
fluorescent marker to chlorotoxin, the malignant cells can be identified in
the living tissues. This technique, acting like a tumor paint, has 500 times greater accuracy in separating a malignant mass from the adjacent healthy
tissue than magnetic resonance imaging. The “tumor painting” has the
potential to be a noninvasive screening tool for the early detection of
various cancers as well as identifying the tumor-positive lymph nodes.
References

Abortion: http://en.wikipedia.org/wiki/History_of_abortion
Abortion: http://www.cbctrust.com/history_law_religion.php
Adrian: http://ca.geocities.com/med_1982on/adrian-bio.html
Adrian: http://en.wikipedia.org/wiki/Edgar_Douglas_adrian
Adrian: http://nobelprize.org/nobel_prize/medicine/laureates/1932/
adrian-lecture.htm
Adrian: http://www.answers.com/topic.edgar-adrian-1st-baron-adrian
Alcohol: http://en.wikipedia.org/wiki/Prohibition
Alcohol: www.searo.who.int/LinkFiles?Facts
Aldini: http://64.233.167.104 search?q=cache:UewpSAaw 0YJ:chem
Alternative medicine: http://altmedicine.about.com/od/alternative
medicinebasics.a/lancet_homeopath.htm
Alternative medicine: http://en.wikipedia.org/wiki/Alternative-medicine
Alternative medicine: http://en.wikipedia.org/wiki/Hydro-therapy
Alternative medicine: http://psychicinvestigator.com/Occult?Mesmr.htm
Alternative medicine: http://sciam.com/print_version.cfm? articleID=131
CED4F-E7F2-99DF-3C84BB412D1D3
Alternative medicine: http://www.cancer.org/docroot/ETO/content/
content/ETO_5_3X_Turmeric.asp?sitearea=ETO
Alternative medicine: http://www.heartlandnaturopathic.com/
history.htm
Alternative medicine: http://www.homeopathic.com/articles/intro/
history.php
Alternative medicine: http://www.hpathy.com/Status/homeopathy-
history.asp
410 | History of Medicine

Alternative medicine: http://www.yourhealthbase.com/alternative_medicine.html
Alternative medicine: http://findarticles.com/p/articles/mi_m2843/is_n4_v22/ai_20915033
Alzheimer: http://www.ibro.org/Pub_Main_Display.asp?Main-ID=34
Alzheimer: http://www.whonamedit.com/doctor.cfm/177.html
Anatomy: http://www.newadvent.org/cathen/01457e.htm
Anticoagulants: http://en.wikipedia.org/wiki/Warfarin
Anticoagulants: http://ndt.oxfordjournals.org/cgi/content/full/15/7/1086
Anticoagulants: http://www.ovc.uoguelph.ca/PathoBio?documents?Hayes-schofieldlectures2003.pdf
Anticoagulants: http://www.rds-online.org.uk/pages.asp?i_ToolbarD=3
Apollo: http://www.answers.com/topic/Apollo
Apollo: www.angelfire.com/realm2/amethysbt/godsapollo.htm
Arabian medicine: http://www.globalcomment.com/science&technology/article_14.asp
Aristotle: http://www.ucmp.berkeley.edu/history/Aristotle.html
Asclepios: www.newsfinder.org
Asclepius: http://en.wikipedia.org/wiki/Rod_of_Asclepius
Aspirin: http://www.medicine.mcgill.ca/mjm/v02n02/aspirin.html
Auenbrugger: http://www.antiquemed.com/invention.html
Auenbrugger: http://www.newadvent.org/cathen/02072a.html
Axel: http://nobelprize.org/medicine/laureates/2004/axel-autobio.html
Ayurveda: http://en.wikipedia.org/wiki/Ayurveda
Ayurveda: http://www.ayurvedahc.com/articlelelive/articles/11/1/The-First-World-Medicine
Ayurveda: http://www.vigyanprasar.com/comcom/inter48.htm
Babinski: http://whonamedit.com/doctor.cfm/370.html
Baillie: http://www.aim25.ac.uk/cgi-bin/frames/fulldesc?insttt_id=8&coll_id=7094
Baillie: http://www.electricscotland.com/history/other/Baillie_matthew.htm
Banting: http://en.wikipedia.org/wiki/Frederick_Banting
Banting: http://poll.imdb.com/name/nm1204775/bio
Banting: http://www.britannica.com/eb/article-9013217
Banting: http://www.pbs.org/wgbh/aso/databank/entries/dm22in.html
Barany: http://nobelprize.org/nobel_prizes/medicine/laureates/1914/barany-bio.htm
Barany: http://www.whonamedit.com/doctor.cfm/639.html
Barnard: http://news.bbc.co.uk/1/hi/health/1470356.stm
Barnard: http://www.answers.com
Beaumont: http://sportsci.org/news/history/beaumont/beaumont.htm
Beaumont: http://www.bookrags.com/sciencehistory/william-beaumont-scit-051234.htm
Beaumont: http://www.britannica.com/eb/article-9014002
Behring: http://www.answers.com/topic/emil-adolf-von-behring
Bekesy: http://nobelprize.org/nobel_prizes/medicine/laureates/1961/bekesy-bio.html
Bell: http://www.electricscotland.com/esegi/print.pl
Bell: http://www.whonamedit.com/doctor.cfm/2103.htm
Berger: http://chem.ch.huji.ac.il/~history/berger.html
Berger: http://www.medscape.com/medicine/abstract/16334737?queryText=electroencephalography
Bernard: http://en.wikipedia.org/wiki/Claude_Bernard
Bernard: http://users.wmin.ac.uk/~mellerj/physiology/bernard.htm
Bernard: http://www.longman.co.uk/tt_secsci/resources/scimon/nov_00/bernard.htm
Bernard: http://www.sportsci.org/news/history/bernard.html
Bernard: http://www.whonamedit.com/doctor.cfm/846.htm
Billroth: http://en.wikipedia.org/wiki/Theodor_Billroth
Billroth: http://www.whonamedit.com/doctor.cfm/2343.html
Black Death: http://www.themiddleages.net/plague.html
Black Death: http://www.watchtower.org/library/g/2000/2/8/article_01.htm
Blackwell: http://www.spartacus.schoolnet.co.uk/USACblackwell.htm
Blood pressure: http://www.medphys.ucl.ac.uk/teaching/undergrad/projects/2003/group_03/history.html
Blood pressure: http://www.vasotrac.com/medhistbp.htm
Blood transfusion: http://en.wikipedia.org/w/index.php?title=Blood_transfusion
Blood transfusion: http://www.health.gov.mt/nbts/history.htm
Boerhaave: http://www.whonamedit.com/doctor.cfm/2404.html
Brown dog: http://www.answers.com/topic/brown-dog-affair
Brown-Sequard: http://en.wikipedia.org/wiki/Charles-%C3%89douard_Brown-S%C3%A9quard
Businessmen surgeons: http://www.ssha.org/gender/bass.html
Cajal: http://en.wikipedia.org/wiki/Santiago_Ram%C3%B3n_y_Cajal
Cajal: http://www.psu.edu/nasa/cajal.htm
Cancer: http://quote.bloomberg.com/apps/news?pid=10000103
Cancer: http://www.cancer.org/docroot/CRI/content/CRI_2_6x_the_history_of_cancer_72.asp?sitea...
Cancer: http://www.rare-cancer.org/history-of-cancer.html
Cannon: http://en.wikipedia.org/wiki/Walter_Cannon
Cannon: http://www.harvardsquarelibrary.org/Unitarians/cannon_walter.html
Carrel: http://en.wikipedia.org/wiki/Alexis_Carrel
Cesarean section: http://www.nih.gov/exhibition/cesarean/cesarean_2.html
Cesarean section: http://www.obgyn.net/displayarticle.asp?page=urogyn/Murphy-book/cs-page2
Chamberlen: http://uh.edu/engines/epi2018.htm
Charak: www.vaipani.com/related_links.htm
Charcot: http://en.wikipedia.org/wiki/Jean-Martin_Charcot
Charcot: http://www.indiana.edu/~intell/charcot.shtml
Charcot: http://www.whonamedit.com/doctor.cfm/19.html
Charles Richet: http://en.wikipedia.org/wiki/In_vitro_fertilization
Chinese medicine: http://www.taijichinesemedicine.com/bienque.htm
Chinese medicine: http://www.americanacupuncture.com/history.htm
Chinese medicine: http://www.amfoundation.org/tem.htm
Chinese medicine: http://www.answers.com/topic/traditional-chinese-medicine
Chinese medicine: http://www.taijichinesemedicine.com/zhangzhongjing.htm
Chinese medicine: http://www.mayoclinic.com/health/acupuncture/SA00086
Cholera: http://www.ph.ucla.edu/EPI/snow/firstdiscovered-cholera.html
Cohnheim: Fye WB, Clin Cardiol: 25, 575-577, 2002
Contraceptives: http://www.medhunters.com/articles/historyOfMaleContraception.html
Contraceptives: http://www.pbs.org/wgbh/amex/pill/peopleevents/p_sanger.html
Curie: http://en.wikipedia.org/wiki/Marie_Curie
Curie: http://womenhistory.about.com/od/mariecurie/p/curie.htm
Curie: http://www.jfcr.or.jp/Ra100/plan-e/mission-e.html
Curie: http://www.motivational-inspirational-corner.com/getquote.html?authorid=3
Curie: http://www.diplomatie.gouv.fr/lebal_france?ENGLISH/SCIENCES/CURIE/marie.html
Cushing: Archives of Pathology and Laboratory Medicine: 125, 1539-1541, 2001
Cushing: http://www.whonamedit.com/doctor.cfm/980.html
Cushing: http://en.wikipedia.org/wiki/Harvey_Cushing
Cushing: http://www.sd-neurosurgeon.com/cushing.htm
Dale: http://nobelprize.org/nobel_prizes/medicine/laureates/1936/dale-lecture.html
Darwin: http://en.wikipedia.org/wiki/Charles_Darwin
Domagk: http://nobelprize.org/medicine/laureates/1939/domagk-bio.html
Domagk: http://www.bayer.com/about-bayer/history/biographies/gerhard-domagk/page1220.htm
Donders: http://www.bookrags.com/biography/frans-cornelis-donders-wap/
Donders: http://www.history-ophthalmology.com/instrumentsw.html
Donders: http://www.mrcophth.com/Historyofophthalmology/physiology.htm
Duchenne: http://www.whonamedit.com/doctor.cfm/950.htm
Dunant: http://en.wikipedia.org/wiki/Henry_Dunant
Dunant: http://nobelprize.org/nobel_prizes/peace/laureates/1901/dunant-bio.html
Eccles: http://en.wikipedia.org/wiki/Sir_John_Carew_Eccles
Eccles: http://jcmr.anu.edu.au/about/nobelprize/eccles.php
Eccles: http://nobelprize.org/nobel-prize/medicine/laureates/1932/press.html
EEG: http://torque.oncloud8.com/archives/cat_history_of_eeg.html
Ehrlich: http://en.wikipedia.org/wiki/Paul_Ehrlich
Ehrlich: http://nobelprize.org/medicine/laureates/1908/ehrlich-bio.html
Einthoven: http://en.wikipedia.org/wiki/Willem_Einthoven
Electroconvulsive therapy: http://en.wikipedia.org/wiki/Electroconvulsive_therapy
Elion: http://en.wikipedia.org/wiki/Gertrude_B._Elion
Elion: http://www.chemheritage.org/EducationalServices?pharm/chemo/readings/lifeline.htm
Endocrines: http://www.hero.ac.uk/uk/research/archives/2005/how_hormones_happened.cfm
Eugenic movement: file://A:\A Science Odyssey
Eugenic movement: http://en.wikipedia.org/wiki/Eugenics
Evolution of modern medicine: http://etext.lib.virginia.edu/etcbin/toccer-new2?id=OslEvol
Falloppio: http://www.whonamedit.com/doctor.cfm/2288.htm
Fetal surgery: http://www.childrenshospital.org/az/Site891/mainpageS891P0.html
Fetal surgery: http://www.fetal-surgery.com/
Fetal surgery: http://www.med.wayne.edu/Wayne%20Medicine/wm96/frontiers.htm
Fleming: http://en.wikipedia.org/wiki/Alexander_Fleming
Fleming: http://inventors.about.com/library/inventors/blpenicillin.htm
Fleming: http://www.historylearningsite.co.uk/Alexander_fleming_and_penicillin.htm
Freeman: http://en.wikipedia.org/wiki/Walter_Freeman
Freud: http://en.wikipedia.org/wiki/Sigmund_Freud
Freud: http://www.brainyquote.com/quotes/authors/s/sigmund_freud.html
Freud: http://www.freudfile.org/cocain.htm
Friedreich: http://www.whonamedit.com/doctor.cfm/1390.htm
Galen: http://www.bbc.co.uk/history/historic_figures/galen_claudius.shtml
Galen: http://www.healthsystem.virginia.edu/internet/library/historical/artifacts/antique/galen.cfm
Galen: http://en.wikipedia.org/wiki/Galen
Galen: www.historylearningsite.co.uk/Claudius_galen.htm
Galvani: http://inventors.about.com/library/inventors/bl_Galvani.htm
Garrett: http://en.wikipedia.org/wiki/Elizabeth_Garrett_Anderson
Garrett: http://womenshistory.about.com/od/physicians/p/e-g_anderson.htm
Golgi: http://nobelprize.org/medicine/articles/golgi/
Graefe: http://www.whonamedit.com/doctor.cfm/359.html
Graves: http://www.whonamedit.com/doctor.cfm/695.html
Haldane: http://www.divernet.com/cgi-bin/articles.pl?id=2602&section=1040&action.htm
Haldane: http://www.dmm.org.uk/archives/a_orbit20.htm
Hales: http://www.nndb.com/people/146/000085888/
Hales: http://en.wikipedia.org/wiki/Stephen_Hales
Hales: http://jap.physiology.org/cgi/content/abstract/57/3/635
Hales: http://www.britannica.com/eb/article-9038874?query=William%20Harvey&ct=eb
Halsted: http://en.wikipedia.org/wiki/William_Stewart_Halsted
Halsted: http://www.whonamedit.com/doctor.cfm/2944.html
Harvey: http://www.timelinesscience.org/resource/students/blood/Harvey.htm
Heart surgery: http://www.pbs.org/wgbh/nova/heart/pioneers.html
Helmholtz: http://en.wikipedia.org/wiki/Hermann_von_Helmholtz
Hess: http://nobelprize.org/nobel_prizes/medicine/laureates/1949/hess-bio.html
Hess: http://www.britannica.com/eb/article-9040286
Hip replacement: http://en.wikipedia.org/wiki/Hip_replacement
Hippocrates: http://www2.sjsu.edu/depts./Museum/hippoc.html
History of Ayurveda: http://www.holistic.ie/essays/ayur.2.htm
History of Ayurveda: http://hpn.healingpeople.com/index.php?option=com_content
History of medicine: http://en.wikipedia.org/wiki/History_of_medicine
History of medicine: http://historyworld.net/wrldhis/PlainTextHistories.asp?groupid=478
History of medicine: http://lish.lsuhsc.edu/fammed/grounds/history.htm
History: http://pacs.unica.it/biblio/lesson7.htm
http://en.wikipedia.org/wiki/In_vitro_fertilization
http://womenshistory.about.com/library/bio/blbio_blackwell_eliz.htm
http://www.mrcophth.com/Historyofophthalmology/anatomy.htm
http://www.shmu.edu.cn/jcyxy/mianyi/History%20of%20immunology.htm
http://www-personal.umd.umich.edu/~jonsmith/19cmed.htm
Hunter John: http://www.smj.org.uk/osler2000.htm
Hunter John: http://www.uh.edu/engines/epi1131.htm
Hunter William: http://www.answers.com/topic/William-hunter
Hygieia: http://en.wikipedia.org/wiki/Hygieia
Hypertension: http://www.hypertensionindia.com/milestones1.htm
Hypertension: http://www.medscape.com/viewarticle/421419
Ig Nobel prize: http://en.wikipedia.org/wiki/Ig_Nobel_Prize
Ig Nobel prize: http://www.bbc.co.uk/dna/h2g2/classic/A973677
Ignarro: http://nobelprize.org/nobel_prizes/medicine/laureates/1998/ignarro-autobio.html
Immunology: file://A:\Brief history of immunology.htm
Incubators: http://www.citipages.com/databank/24/1190/article11519.asp
Incubators: http://www.historycooperative.org/journals/ht/35.1/Lieberman.html
Incubators: http://www.neonatology.org/pdf/7200377a.pdf
Indian medicine: http://www.histmedindia.org
Influenza epidemic: http://www.pbs.org/wgbh/aso/databank/entries/dm18fl.htm
Jenner: http://en.wikipedia.org/wiki/Edward_Jenner
Jenner: http://www.bbc.co.uk/history/historic_figures/jenner_edward.shtml
Jenner: http://www.jennermuseum.com/ej/cuckoo.shtml
Kandel: http://en.wikipedia.org/wiki/Eric_R_Kandel
Kandel: http://www.hhmi.org/research/investigators/kandel.html
Koch: http://answers.com/topic/robert-koch
Koch: http://historylearningsite.co.uk/robert_koch.htm
Koch: http://nobelprize.org/medicine/educational/tuberculosis/readmore.htm
Koch: http://nobelprize.org/medicine/laureates/1905/koch-bio.htm
Koch: http://www.britannica.com/nobel/micro/325_28.htm
Kolff: http://www.achievement.org/autodoc/page/ko10bio-1
Kolff: http://www.davita.com/articles/dialysis/index.shtml?id=197
Kolff: http://www.nature.com/ki/journal/v62/n5/full/4493309a.html
Krogh: http://nobelprize.org/nobel_prizes/medicine/articles/krogh/index.html
Krogh: http://nobelprize.org/nobel_prizes/medicine/laureates/1920/krogh-bio.html
Laennec: http://www.newadvent.org/cathen/08737b.html
Laennec: http://www.whonamedit.com/doctor.cfm/527.html
Laennec: http://www2.umdnj.edu/~shindler/hxdx.html
Landsteiner: http://nobelprize.org/nobel_prizes/medicine/laureates/1930/landsteiner-bio.html
Landsteiner: http://www.whonamedit.com/doctor.cfm/2794.htm
Laryngoscopy: Cooper RM, Can J Anaesth: 51, 1-5, 2004
Lauterbur: http://nobelprize.org/medicine/laureates/2003/lauterbur-autobio.html
Leeuwenhoek: http://www.slic2.wsu.edu:82/hurlbert/micro101/pages/Chap1.htm
Leonardo: http://en.wikipedia.org/wiki/Leonardo_da_Vinci
Leonardo: http://www.geocities.com/historyday2001/anatomy.html?20065
Leonardo: http://www.mos.org/leonardo/scientist.html
Leonardo: http://www.stanford.edu/~mgoman/essays/Sarah.htm
Lister: http://www.famouspeople.co.uk/j/josephlister.html
Lister: http://www.historylearningsite.co.uk/joseph_lister
Lister: http://www.nahste.ac.uk/isaar/GB_0237_NAHSTE_P0389.html
Livingstone: http://www.livingstoneonline.ucl.ac.uk/biog/dl/bio.html
Lobotomy: http://auto.tao.ca/node/view/990?OHPSESSID=30
Lobotomy: http://npr.org/templates/story.php?storyId=5014080
Lobotomy: http://www.lobotomy.info/adventures.html
Loewi: http://en.wikipedia.org/wiki/Otto_Loewi
Ludwig: http://en.wikipedia.org/wiki/Carl_Ludwig
Ludwig: http://www.britannica.com/eb/article-9049277
Mackenzie: http://en.wikipedia.org/wiki/Morrell_Mackenzie
Magendie: http://en.wikipedia.org/wiki/Fran%C3%A7ois_Magendie
Magendie: http://www.whonamedit.com/doctor.cfm/2104.htm
Malaria: http://archives.idrc.ca/books/reports/1996/01-05e.htm
Malaria: http://bell.lib.umn.edu/Products/cinch.htm
Malaria: http://sres.anu.edu.au/associated/fpt/nwfp/quinine/Quinine.htm
Malaria: http://web.caryaacademy.org/chemistry/rushin/Studentprojects/
Malaria: http://www.malaria-ipca.com/mw_history.htm
Malaria: http://www.newadvent.org/cathen/08372b.htm
Malpighi: http://en.wikipedia.org/wiki/Marcello_Malpighi
Malpighi: http://micro.magnet.fsu.edu/optics/timeline/people/malpighi.html
Malpighi: http://www.spaceship-earth.org/Biograph/Malpighi.htm
McDowell: http://elane.stanford.edu/Wilson/Text/4j.html
Medical missionaries: http://bge.gospelcom.net/emis/vrekenmono/vreken1.htm
Medical saints: http://pacs.unica.it/biblio/lesson2.htm
Medieval medicine: http://en.wikipedia.org/wiki/Medieval_medicine
Medieval medicine: http://medieval-castles.org/index.php?cat=16
Medieval medicine: http://www.learningsite.co.uk/medicine-in_the_middle_ages.htm
Medieval medicine: http://www.schoolscience.co.uk/content/4/biology/abpi/history/history5.html
Medieval medicine: http://www.wantage.com/museum/Local_History/Medieval%20Hospitals.pdf
Meniere: http://www.whonamedit.com/doctor.cfm/1859.htm
Miasma: http://en.wikipedia.org/wiki/Miasma_theory_of_disease
Miasma: http://www.medterms.com/script/main/art.asp?articlekey=19304
Moniz: http://en.wikipedia.org/wiki/Egas_Moniz
Moniz: http://www.britannica.com/eb/article-9032065?query=moniz
Morgagni: http://en.wikipedia.org/wiki/Giovanni_Battista_Morgagni
Morgagni: http://www.whonamedit.com/doctor.cfm/312.html
Murray: http://www.answers.com/topic/Joseph-murray
Murray: http://www.joelmbabb.com/transplant1.htm
Nightingale: http://www.answers.com/topic/Florence-nightingale
Nightingale: http://www.satucket.com/lectionary/Florence_Nightingale.htm
Nightingale: http://www-groups.dcs.st-and.ac.uk/history/Printonly/Nightingale.html
Nightingale: http://www.victorianweb.org/history/crimea/florrie.htm
Nineteenth century London: http://www.fidnet.com/~dap1955/dickens/dickens_london.htm
Nineteenth century London: http://www.geocities.com/victorianmedicine/entire.html?20071
Nissl: http://www.whonamedit.com/doctor.cfm/2465.htm
Nitric oxide: http://query.nytimes.com/gst/fullpage.html
Nitric oxide: http://www.absw.org.uk/Briefings?Nitric%20oxide.htm
Nobel: http://history1900s.about.com/library/weekly/aa042000a.htm
Nobel: http://www.lucidcafe.com/library/95oct/alfnobel.html
Orthopedic surgery: http://en.wikipedia.org/wiki/Orthopedic_surgery
Orthopedics: http://www.worldortho.com/pg3.htm
Paintal: Current Science: 88, 513-514, 2005
Pare: http://en.wikipedia.org/wiki/Ambroise_Pare
Pare: http://www.bookrags.com/sciences/sciencehistory/amputation-woi.html
Pare: http://www.britannica.com/ebi/article-9276283
Pare: http://www.general-anaesthesia.com/images/ambroise-pare.htm
Pare: http://www.hap.be/English/ambroise%20pare.htm
Pare: http://www.newadvent.org/cathen/11478a.htm
Pare: http://www.uh.edu/engines/epi327.htm
Pasteur: http://en.wikipedia.org/wiki/Louis_Pasteur
Pasteur: http://www.answers.com/topic/louis-pasteur
Pasteur: http://www.historylearningsite.co.uk/louis_pasteur.htm
Pavlov: http://nobelprize.org/medicine/educational/Pavlov/readmore.html
Pavlov: http://nobelprize.org/medicine/laureates/1904/pavlov-bio.html
Penfield: http://www.pbs.org/wgbh/aso/databank/entries/bhpenf.htm
Penfield: http://www.whonamedit.com/doctor.cfm/3099.html
Population explosion: http://www.yale.edu/ynhti/curriculum/units/1998/7/98.07.02.x.htm
Pott: http://www.whonamedit.com/doctor.cfm/1103.htm
Priestley: http://home.nycap.rr.com/useless/peristly.html
Psychiatry: http://www.cerebromente.org.br/n04/historia/shock_i.htm
Psychiatry: http://www.dushkin.com/connectext/psy/ch14/treat.mhtml
Psychotropic drugs: http://www.postgradmed.com/issues/1997/03_97/psych.htm
Red Cross: http://en.wikipedia.org/wiki/Red_Cross
Red Cross: http://www.redcross.ie/about/irc.html
Renaissance: http://en.wikipedia.org/wiki/Renaissance
Renaissance: http://www.twingroves.district96.k12.il.us/Renaissance?Hospital/Hygiene.html
Ridley: http://en.wikipedia.org/wiki/Harold_Ridley_(ophthalmologist)
Riva-Rocci: http://www.whonamedit.com/doctor.cfm/1194.htm
Robotic surgery: http://en.wikipedia.org/wiki/Robotic_surgery
Rontgen: http://www.absoluteastronomy.com/ref/Wilhelm_conrad_r%C3%B6ntgen
Ross: http://en.wikipedia.org/wiki/Ronald_Ross
Selye: http://www.brainconnection.com/topics/printindex.php3main=fa/selye
Semmelweis: http://en.wikipedia.org/wiki/Ignaz_Semmelweis
Semmelweis: http://www.whonamedit.com/doctor.cfm/354.html
Sethi: http://www.time.com/time/reports/heroes/foot.html
SGS medical college: http://en.wikipedia.org/wiki/Seth_Gordhandas_Sunderdas_Medical_College_and_King_Edward_Memorial_Hospital
Sherrington: http://www.whonamedit.com/doctor.cfm/2266.htm
Sims: http://www.mja.com.au/public/issues/178_12_160603/dec10116_fm.html
Spallanzani: http://www.powells.com/biblio?show=0805073167&page=excerpt
Starling: http://www.whonamedit.com/doctor.cfm/1188.htm
Sulfa drug: http://infoplease.com/ce6/sci/A847153.htm
Surgery: http://en.wikipedia.org/wiki/surgery
Sushruta: Chari PS, Indian J Plast Surg: 36, 4-13, 2003
Sydenham: http://www.sydenham.org.uk/Thomas_sydenham.html
Sydenham: http://www.whonamedit.com/doctor.cfm/1989.html
Symbols of Modern Medicine: www.annals.org/cgi/content/abstract/138/8/673
Syphilis: http://en.wikipedia.org/wiki/Syphilis
Tait: http://en.wikipedia.org/wiki/Robert_Lawson_Tait
Tait: http://www.freewebs.com/scientific_anti_vivisectionism4?asepsisandantisepsis.htm
Telemedicine: http://en.wikipedia.org/wiki/Telemedicine
Thalidomide: http://en.wikipedia.org/wiki/Thalidomide
Thalidomide: http://www.rlc.deeed.edu/MATHSCI/Reynolds/thalidomide/history/history.html
Thermometer: http://qjmed.oxfordjournals.org/cgi/content/full/95/4/251
Transplant surgery: file://A:\History of Transplant immuno-biology.htm
Truth serum: http://en.wikipedia.org/wiki/LSD
Truth serum: http://news.bbc.co.uk/2/hi/uk_news/1740746.stm
Truth serum: http://www.consortiumnews.com/2002/060402a.htm
Truth serum: http://www.druglibrary.org/schaffer/history/e1950/mkultra/Hearing04.htm
Tuberculosis: http://en.wikipedia.org/wiki/Tuberculosis
Tuberculosis: http://www.goshen.edu/bio/Biol206/Biol206LabProject/tricia/Tbhx.htm
Uroscopy: http://www.doctorsreview.com/archives/2005/no_09/sep05_history.html
Vakil: Journal, Indian Academy of Clinical Medicine: 3, 2002
Vakil: Texas Heart Institute Journal: 33, 161-168, 2006
Vesalius: http://en.wikipedia.org/wiki/Vesalius
Vesalius: http://www.bbc.co.uk/history/historic_figures/vesalius_andreas.shtml
Vesalius: http://www.newadvent.org/cathen/15378c.htm
Viagra: http://en.wikipedia.org/wiki/Viagra
Virchow: http://en.wikipedia.org/wiki/Rudolf_Virchow
Virchow: http://www.mnsu.edu/emuseum/information/biography/uvwxyz/virchow_rudolf.htm
Virchow: http://www.pathguy.com/virchow.htm
Virchow: http://www.whonamedit.com/doctor.cfm/912.html
Vitamins: http://chemistry.gsu.edu/glactone/vitamins/b1/
Vitamins: http://dannyreviews.com/h/Beriberi.html
Vitamins: http://en.wikipedia.org/wiki/Vitamin
Vitamins: http://pediatrics.aappublications.org/cgi/content/full/108/4/e76
Vitamins: http://www.bbc.co.uk/history/discovery/exploration/captaincook_scurvy_01.shtml
Vitamins: http://www.bookrags.com/sciences/sciencehistory/vitamin-b12-wsd.html
Vitamins: http://www.britannica.com/nobel/micro/64_72.htm
Vitamins: http://www.mnwelldir.org/docs/history/vitamins.htm
Waksman: http://en.wikipedia.org/wiki/Selman_Waksman
Warren: http://www.tallpoppies.net.au/cavalcade/warren.htm
Wet nursing: http://www.lalecheleague.org/llleaderweb/LW/LVJulAug95p53.html
Withering: http://pubs.acs.org/subscribe/journals/mdd/v05/i03/html/03timeline.htm
Withering: http://www.ganfyd.org/index.php?title=William_Withering
Withering: http://www.jameslindlibrary.org/trial_records/17th_18th_Century/withering/withering_com
Young: http://www.whonamedit.com/doctor.cfm/1715.html
Young: http://en.wikipedia.org/wiki/Thomas_Young_(scientist)
Index

A
Abnormal heart sounds 88
Abortion 385
Abu Ali Ibn Sina 35
Acupuncture 329
Addison's monograph 243
Adiposogenital syndrome 115
Adrenal cortical hormones 245
Advances in physiology of digestion 90
Air thermoscope 87
Alexander Fleming—the beginning of antibiotic era 226
Alfred Nobel 180
Alois Alzheimer 102
Alternative medicine 327
Alzheimer's disease 103
Ambroise Pare 54
American research workers in medicine: from zero to heroes 191
Amputation 55
Ancient Chinese medicine 7
Ancient concept of the eyeball 249
Ancient Indian medicine 2
Andreas Vesalius 44
Antipyretic era 175
Anton Leeuwenhoek-the first microbiologist 63
Apollo 11
Arabic medicine 2
Aristotle 19
Artificial heart 404
Asclepius 11
August Krogh 220
Ayurvedic medicine 2

B
Babinski's sign 114
Baby incubators 395
Barefoot doctors of China 306
Black death 31
Bloodletting and purgatives–panacea for all inflammations 39
Body proportions of man by Leonardo 48
Brown dog affair 193
Brown-Séquard 122
C
Caduceus of Hermes 13
Camillo Golgi 198
Carbolic acid spray 141
Carl Ludwig 107
Cataract surgery 357
Cerebral cortex 102
Cesarean section 383
Charak Samhita 2, 4
Charles Bell 95
Charles Darwin 124
Chinese meridians 10
Christian Barnard 274
Claude Bernard 118
Claudius Galen 21
Cocaine episode 116
Contraceptives 387
  oral contraceptives 389
  vasectomy and IUCDs 388
Cowpox 82

D
Dark ages in western medicine 26
David Livingstone 169
Development of thermometer 86
Dhanvantari school of ayurveda 3
Digitoxin 81
Diphtheria 341
Discovery of anticoagulants 276
Discovery of endocrines 242
Discovery of Helicobacter pylori 322
Discovery of T- and B-lymphocytes 347
Discovery of vitamins 238

E
Edward Jenner 81
Egas Moniz 251
18th century Europe–an export house of diseases 41
Electroconvulsive therapy 260
Electroencephalography 269
Eli Metchnikoff 342
Elizabeth Blackwell 155
Elizabeth Garrett Anderson 159
Emil Behring 341
Eric Richard Kandel 301
Eugenic movement 312
Extracapsular extraction of cataract 358

F
Fabricius Geronimo 57
Felix Hoffmann 176
Fetal surgery 405
Fever treatment of mental disorders 258
Florence Nightingale 149
Foxglove 80
François Magendie 93
Franz Nissl 104
Frederick Banting–discovery of insulin 246
Freeman's lobotomy 254
First artificial kidney 294
From mal'aria to malaria 215

G
Gabriele Falloppio 52
Gastrointestinal medicine 92
Gene therapy 406
Gertrude Elion 279
Giovanni Morgagni 69
Glycoside 81
Golden era of Arabic medicine 32
Golden era of surgery 213
Graefe's method 355
Guillaume Duchenne 105

H
Hans Selye 287
Harvey Williams Cushing 205, 206
Health care in 19th century London 126
  hospitals 129
  physicians/doctors 127
  sanitation 126
  surgeons 128
Helmholtz's eye mirror 352
Henry Dale 263
Hippocrates 14
Hippocrates' four humors 18
  black bile 18
  blood 18
  phlegm 18
  yellow bile 18
Hippocratic aphorisms 17
Hippocratic collection 16
Hippocratic oath 15
Historical image of women as patients 130
History of
  alcohol 315
  blood transfusion 210
  cancer 285
  corruptions of Christianity 78
  development of specialities 339
  electrodiagnostic techniques 265
  immunology 340
  obstetrics and gynecology–childbirth 378
  ophthalmology–ancient concepts of anatomy of the eye 348
  organ transplantation 296
  orthopedics–Nicolas Andry—the birth of orthopedics 366
  otorhinolaryngology–history of laryngoscope 360
  tobacco smoking and lung cancer 282
  tuberculosis 231
  western medical education in India 169
Horace Wells 160
Human skeleton by Leonardo 48
Hysteroepilepsy 113

I
Ibn Al Nafis 35
Ice-pick surgery 253
Ig Nobel Prizes 335
Ignaz Semmelweis 134, 135
Influenza pandemic 1918 218
Ingenious investigations on digestive physiology 68
Insulin pump therapy 402
Insulin shock therapy 259
Intra-aortic balloon pump 292
Intraocular lens (IOL) transplant surgery 359
Invention of sphygmomanometer 203
Isaac Judaeus 43
Ivan Petrovich Pavlov 208

J
Jean-Martin Charcot 109, 113
Jivraj Mehta 172
John Charnley–joint replacement therapy 374
John Hunter 73
Joseph Babinski 113
Joseph Priestley 77
Joseph Murray 298
Journal de experimentale 94
Julius Cohnheim 100
Juxtacapillary pulmonary receptors 309

K
Kitab-Sushrud 4
Koch's postulates 146
Kuntscher and Ilizarov 373

L
Laennec's stethoscope 89
Leeuwenhoek's microscope 64
Leonardo da Vinci 47
Leopold Auenbrugger 72
Lion incubator 397
Louis Pasteur 136
  germ theory of disease 136
  liquid growth medium 137
Love-Sick woman 131

M
Marcello Malpighi 62
Marie Curie 201
Matthew Baillie 76
Medical marvels of 21st century 401
Medical missionaries 167
Medieval medicine 25
Metrazol–shock therapy 259
Miasma theory of disease 132, 133
Motor homunculus 301
Mythological gods and goddesses in ancient western medicine 11

N
19th century operation theater 129
Nitric oxide: from menace to marvel of the decade 302
Nobel laureates in immunology 344
  Baruj Benacerraf 346
  Cesar Milstein 346
  Charles Richet 344
  Daniel Bovet 345
  George Kohler 346
  Gerald M Edelman 346
  Jean Dausset 346
  Jules Bordet 344
  Karl Landsteiner 344
  Macfarlane Burnet 345
  Max Theiler 345
  Niels K Jerne 346
  Peter B Medawar 345
  Peter Doherty 346
  Rodney R Porter 346
  Rolf Zinkernagel 346
  Rosalyn S Yalow 346
  Susumu Tonegawa 346
Nobel prize winners in physiology or medicine 182

O
Organotherapy 124
Otto Loewi 262

P
Pandit Madhusudan Gupta 170
Paracelsus 49
Paul Ehrlich 147
Pioneer neurohistologists 198
Pioneers in heart surgery 272
Plaster of Paris 372
Plastic and reconstructive surgery 5
Population explosion: an impact of better health care 324
Prefrontal lobotomy 260
Psychotropic drugs 116

Q
Quotations by Bernard 121
Quotations of Jean-Martin Charcot 112
Quotations of Sydenham 60
  on gout 60
  on medical practice 61

R
Radial pulse 10
Ramalingaswami 307
Reconstructive rhinoplasty 5
Red cross movement 153
Renaissance of medicine 38
Richard Axel 319
Rishi Atreya's school 2
Rishi Dhanvantari's school 2
Riva-Rocci sphygmomanometer 204
Robert Graves 97
Robert Koch 143
  anthrax 144
  father of bacteriology 145
  founder of the science of bacteriology 143
  glass slides with stained bacteria 145
  microscope 145
  test-tube with culture media 145
  tissue samples 145
Robotic surgery 404
Role of battle-fields in medical research 236
Role of criminals in the development of anatomy 75
Ronald Ross 215
Rudolf Virchow 98

S
Scorpion venom–a diagnostic tool 406
Selman Waksman 229
Senile dementia 103
Sigmund Freud 115
Sir Joseph Lister 140
Soldier with tetanus 96
Some famous neurophysiologists 289
Stephen Hales 66
Sushrut Samhita 4
Syphilis 81

T
Talk therapy 116
Telemedicine 403
Test tube babies (in vitro fertilization) 304
Thalidomide disaster 280
The first woman doctors in the USA and England 155
Theodor Billroth 162
Thomas Sydenham 59
Thomas Young 349
Traditional Chinese medicine 10
Transplant leg 32
Treatise on mediate auscultation 88
Treatment of psychological disorders 256
Tropical medicine–a byproduct of imperialism 174
Truth serum 317

U
Uroscopy–the ultimate diagnostic investigation 43

V
Venous valves 57
Volta's pile 266
Von Graefe 354

W
Walter Cannon 248
Walter Freeman 253
Walter Rudolf Hess 250
Werner Forssmann–the first cardiac catheterization 270
Wet nursing and artificial feeding 398
Wilhelm Conrad Roentgen 195
William Withering 79
Willem J Kolff 292
William Halsted 165
William Harvey 55
William Morton 161

X
X-ray studio 197
