
Inventions and Discoveries

Electromagnetic waves..................................................................3
The Wireless..................................................................................5
Television.....................................................................................12
Camera.........................................................................................15
Electricity....................................................................................17
Blood groups...............................................................................19
Printing .......................................................................................21
Bacteria........................................................................................23
Cells.............................................................................................28
Antibiotics...................................................................................30
Petroleum.....................................................................................32
Oxygen........................................................................................34
Refrigerator..................................................................................39
Pencil and Pen.............................................................................43
Computer.....................................................................................44
Electric lamp................................................................................47
Automobiles................................................................................50
Electric battery.............................................................................52
Loudspeaker................................................................................54
Microphone..................................................................................56
Microwave oven..........................................................................58
Airplane.......................................................................................64
Laser............................................................................................67
Vaccination..................................................................................70
Clocks and watches.....................................................................72
Wheel...........................................................................................74
Glass............................................................................................80
Portland Cement..........................................................................82
Bicycle.........................................................................................84
Iron..............................................................................................87
Toothpaste....................................................................................90
Thermometer...............................................................................91
Soap.............................................................................................94
Cinema.........................................................................................96
Tape recorder...............................................................................98
Electromagnetic waves
In 1831, a British scientist, Michael Faraday, discovered that a changing electric current in a coil of wire can induce a current in a nearby coil. The current induced in the second coil is proportional to its number of turns. James Clerk Maxwell, a compatriot of Faraday, was a theoretician. A theoretician is a scientist who does not work with instruments or devices but instead works with mathematical formulations of observations. In 1865, as a result of his studies, he discovered the mechanism of interaction between electricity and magnetism. He suggested that a change in electric current can start a train of waves, the electromagnetic waves, that radiate into space just like light waves. According to him, the only difference between a light wave and other electromagnetic waves is one characteristic of waves - the wavelength. Not all scientists accepted Maxwell's ideas; after all, there was no proof of the existence of electromagnetic waves. In 1879, the Berlin Academy of Science offered a prize to anyone who could prove that electromagnetic waves exist. Heinrich Rudolf Hertz, a German scientist, took up the challenge in 1886.
Hertz knew the work of Faraday. He devised a simple experimental setup made up of two devices. The first device had two coils placed near one another. He passed electric current from a battery into the first wire coil. The second coil had many more turns than the first coil. As per the discovery of Faraday, the voltage developed in the second coil was much higher than that of the battery. This current was led to a pair of capacitors. (A capacitor is a pair of metal plates that can accumulate electricity until they can hold no more.) As soon as the capacitors were charged to their capacity, they discharged by sending an electric spark between two small metallic balls. The second device had similar balls connected to a wire that was bent into a circle, and it was placed at a distance from the first device. He demonstrated that whenever an electric spark was generated in the first device, a spark could be observed in the second device also, even though the two were not connected through any wires. The only way these two devices could communicate with one another was through electromagnetic waves. This proved Maxwell's ideas.
The Wireless
It was then discovered that, of the various electromagnetic waves, only those having a wavelength of more than a meter could be used for remote wireless communication. For example, light waves could not be used for communication because most common objects obstruct them. They cannot pass through the wall of a building. Electromagnetic waves that can go across walls, and hence can be used for long-distance communication, are called radio waves. They can be transmitted without wires or through wires, just like electricity. It was also found that radio waves having nearly equal wavelengths interfere with one another if received simultaneously at a particular location. Therefore radio waves of a particular wavelength can be used to communicate with people at a particular location only if nobody else is transmitting radio waves of the same wavelength.
Many inventors in different countries tried simultaneously to invent a communication device using radio waves. For example, in 1893 a Serbian-born scientist, Nikola Tesla, made the first public demonstration of such a system. He described and demonstrated in detail the principles of radio communication. The apparatus that he used contained almost all the elements that were used later. In 1894, an Indian scientist, Jagadish Chandra Bose, also demonstrated publicly the use of electromagnetic waves in Kolkata. He was not interested in patenting his work, so his work is not recognized internationally. In the same year a British physicist, Sir Oliver Lodge, demonstrated the reception of Morse code signalling using radio waves with the help of a detecting device -- a coherer. This coherer was a tube filled with iron filings. It was invented by an Italian, Temistocle Calzecchi-Onesti, in 1884 to drain off electricity during lightning. Edouard Branly of France and Alexander Popov of Russia later produced improved versions of the coherer. Many people claim that Popov was the first person to develop a practical communication system.
The inventor who is generally recognized as the inventor of wireless telegraphy is Guglielmo Marconi, an Italian. He began by building an apparatus similar to the one used by Hertz. He added a telegraph key to the spark generator, so that he could send signals corresponding to the dots and dashes of the Morse code. To check whether it was a practical communication device, Marconi moved his apparatus outdoors to try its transmission and reception over long distances. During these experiments he made a lucky discovery: when one terminal each of the generator and the receiver was connected to the ground, communication was possible across longer distances. He also discovered the need for antennas (aerials); they transmit signals from the transmitter into space and from space to the receiver equipment. By 1895, Marconi had developed a device with which he could send signals across a few kilometers. Marconi got a patent for his inventions in 1896, the world's first patent for "radio communication". After patenting his invention Marconi established a company called Marconi's Wireless Telegraph Company in London. In 1898 Marconi successfully transmitted signals across the English Channel. The most dramatic use of wireless was for rescuing ships in distress. Ships equipped with wireless telegraphy equipment could send distress messages to, and receive them from, ships sailing nearby.
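The dots and dashes of the Morse code mentioned above are simply short and long bursts of the transmitted signal. As a rough illustration, here is a tiny sketch in Python; the table is abridged to just the two letters needed to spell the famous distress call, and the function name is invented for the example:

```python
# A tiny, abridged Morse-code table: '.' stands for a short burst of
# the transmitter, '-' for a long one. Only the letters needed for
# the distress call "SOS" are included in this illustration.
MORSE = {"S": "...", "O": "---"}

def encode(message):
    # Join the code for each letter with a space so that the
    # operator at the receiving end can tell the letters apart.
    return " ".join(MORSE[letter] for letter in message)

print(encode("SOS"))  # prints: ... --- ...
```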
Till the beginning of the twentieth century, wireless
communication was limited to telegraphy. Many
people dreamt of wireless telephony at that time,
but the technology to achieve that was not
available.
Sound waves are continuous waves; their frequency is much lower than that of electromagnetic waves. (Frequency is another characteristic of a wave, very closely related to its wavelength.) For wireless communication a sound signal has to be converted into a radio wave. It was soon found that any electric signal can be carried on a radio wave (modulation of electromagnetic waves). All that was necessary for wireless communication of sound was equipment that could generate an electric current having the frequency of radio waves. Several inventors invented such devices. The most notable amongst them were Nikola Tesla, who developed practical alternating-current systems, and Ernst Alexanderson, who built the first alternator that could produce alternating current with a frequency of about 50,000 cycles per second.
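The wavelength and the frequency of a wave are tied together by the speed at which the wave travels; for radio waves that speed is the speed of light. As a worked example, using Alexanderson's figure of about 50,000 cycles per second:

$$\lambda = \frac{c}{f} \approx \frac{3 \times 10^{8}\ \text{m/s}}{5 \times 10^{4}\ \text{cycles/s}} = 6000\ \text{m}$$

so such an alternator produced radio waves about six kilometers long.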
Although the exact time when the human voice was first transmitted by radio is debatable, it is claimed that speech was first transmitted across the American continent, from New York City to San Francisco, in 1915. During the First World War, radiotelephony between ground and aircraft was also tried. The first ship-to-shore two-way radio conversation occurred in 1922. However, a public radiotelephone service for people at sea was inaugurated only in 1929. At that time telephone contact could be made only with ships within 2000 km of the shore. Today every large ship, wherever it may be on the globe, can be contacted using wireless equipment.
The Telephone
The success of the telegraph by 1874 enthused several young minds in Europe and America. People started dreaming about the possibility of talking through wires, but nobody knew how voice could be converted into electric current and vice versa. One such young man was Alexander Graham Bell. His father was a speech teacher who had worked out a system called visible speech. This system used symbols to represent all of the sounds that people make while speaking. He hoped to use this "sound alphabet" for teaching the art of speaking to deaf people. Deaf people have trouble speaking clearly because they cannot hear what they are saying.
Young Alexander Bell was fascinated by his father's
work. When he was sixteen years old his father
challenged him to build a machine that could make
speech sounds. He therefore studied the larynx, the
voice-producing organ, of a lamb. Soon he
developed a voice box that made different sounds
using levers. He also studied how the mouth
changes shape while making vowel sounds. From
books he came to know that a learned German scientist, Hermann von Helmholtz, had used electrically operated tuning forks to reproduce certain sounds of human speech.
Graham Bell started his efforts in the direction of the invention of the telephone by attempting to develop a "harmonic telegraph", a device that would allow several telegraph operators to send messages on the same wire at the same time. In the course of this work he developed the idea for the telephone. By October 1874, Bell's
research had progressed to the extent that he could
inform his future father-in-law, Gardiner Greene
Hubbard, about the possibility of a multiple
telegraph. Hubbard resented the absolute control over telegraph services exerted by the Western Union Telegraph Company in the USA at that time. He instantly saw in Bell's efforts a potential for breaking such a monopoly, so he gave Bell the
financial backing he needed. Bell proceeded with
his work on the multiple telegraph. But he did not
reveal to Hubbard that he and Thomas Watson, a
young electrician whose services he had enlisted,
were also exploring an idea that had occurred to
him that summer. The idea was to develop a device
that would transmit speech electrically. They were
working on a device that used steel reeds that could
be set in vibration by electromagnets. One day
Watson tightened an adjustment screw of his device
a little too much. This prevented the reed from
vibrating, so he plucked the reed to try to set it in
motion again. Bell, sitting in another room next to his instruments, heard a sound coming from the reeds in the device near him. He rushed to Watson to find out how it had happened. What excited him most was that the sound was not produced by an on-and-off electric current, as was the case with the electric telegraph; it was a continuous sound. Soon thereafter Bell experimented with vibrating membranes instead of reeds. He was prompted to do so by his knowledge of the human ear. Within a few weeks he was successful in transmitting the sounds of the human voice through a system composed of a microphone and a speaker. The microphone was like a funnel: one end was open, and the other end pointed to a membrane connected to a small piece of metal that followed the vibrations of the membrane. This vibrating piece was coupled to a coil so as to induce an electric current that could reproduce the voice sent into the funnel. Bell's microphone changed sound waves into an electric current whose intensity varied rapidly with the sound. An electric current travels much faster than sound, and it is easier to transmit across long distances.
Graham Bell was not the only person who was working on such an idea. Another American inventor, Elisha Gray, was working on similar lines. In fact, he came close to success at just the same time. But on February 14, 1876, when Bell's father-in-law filed an application for the preliminary patent of Bell's invention, Elisha Gray was just a few hours too late. Nevertheless, Bell had to face many problems similar to the ones faced by many other inventors at that time. Nobody was initially interested in his invention. When he offered his patent for 100,000 American dollars in 1877, the response was, "What shall we do with a toy like that?" The telephone invented by Graham Bell was not immediately accepted for conversation; it was more commonly used to send and listen to music. But after some improvements it became popular for conversations.
Television
Television was not invented by one inventor; many inventors from various parts of the world contributed. Therefore, its story has to be told slightly differently.

Television, in a way, is an extension of our sense of vision. The process of televising a visual consists of three steps: seeing it through a camera, transmitting it to remote places, and finally producing its image on the screen of a TV. The
camera used for television is different from a
photography camera. A photography camera cannot
be used for televising because it does not produce
any electrical signals. Finding a method to convert
the image of a visual into an electric current was
indeed the first challenge for the potential inventors
of television. The discovery that led to the invention
of television was the discovery of the chemical
element “Selenium”. A Swedish scientist, Jacob
Berzelius, discovered it in the early nineteenth
century. It produces an electric current when light
falls on it and is therefore called a photosensitive
element (photo = light). This discovery led to
invention of several devices that could convert an
image into an electric current and reproduce the
image. One such device was invented by a German
engineer, Paul Nipkow, in 1884. This device, the "Nipkow disk", was an electromechanical device and hence was not very successful.

For the later developments it was necessary to know what exactly an electric current is. Nobody
knew it till 1897! Electric current became known to
be a flow of electrons after an English scientist, J.J.
Thomson, discovered electrons -- the tiny negatively
charged particles in atoms. The invention of the cathode ray tube (CRT) by a German scientist, Karl Ferdinand Braun, was perhaps crucial for the development of an electronic television. In 1897, soon after electrons had been "discovered", Braun, like many other scientists of the day, was intrigued.
Braun discovered that a stream of electrons
emanating from a negatively charged electrode
(cathode) inside a glass tube from which most of
the air had been removed—a cathode ray tube
(CRT)—could be focused to a point at the end of the
tube. If the end of the tube were coated with a
fluorescent material, it would glow wherever the
stream of electrons hit it. Braun also used a magnet
outside the tube, which interacted with the
electrons in the beam to move it back and forth. By
moving the magnet, he could trace patterns on the
screen.

It was later discovered that an image projected or focused on the screen of a CRT can be
scanned, read like the text written on a paper, by
moving its beam over each point of the image.
Cathode rays were moved using electromagnets
because it was already known that the strength of
an electromagnet can be varied by changing the
electric current flowing through it. Scanning an
image to produce electric signal was therefore now
possible. Inventors tried coating the screen of a CRT with selenium and found that the characteristics of the electric current produced depend on the image focused on the screen. The first electronic device that was close to the modern TV was invented by a Russian-born inventor, Vladimir Zworykin, in 1923. He called it the 'iconoscope'; it laid the foundations for early television cameras.
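For readers who like to see the idea of scanning in code, the short Python sketch below walks over a made-up grid of brightness values point by point, row by row, turning the picture into a one-dimensional signal and then putting it back together; the grid and its size are invented purely for illustration:

```python
# Toy illustration of scanning: a tiny "image" is read point by point,
# row by row, into a one-dimensional signal, then rebuilt from it.
# The 3x4 grid of brightness values below is made up for the example.
image = [
    [0, 2, 2, 0],
    [1, 9, 9, 1],
    [0, 2, 2, 0],
]

# Scanning: visit every point in order and record its brightness.
signal = [brightness for row in image for brightness in row]

# Reconstruction: cut the signal back into rows of the original width.
width = len(image[0])
rebuilt = [signal[i:i + width] for i in range(0, len(signal), width)]

print(signal)            # the serial signal that would be transmitted
print(rebuilt == image)  # True: the picture can be reassembled
```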

However, the inventor who is most often credited with the invention of television is John Logie Baird,
Scottish engineer. He achieved the transmission of
simple face shapes in 1924 using Nipkow’s disc.
Baird demonstrated 'television' publicly in London
on March 25, 1925.

The details of a scene in front of the camera can be transmitted either through wireless transmitters (very similar to those used for broadcasting sound) or through a cable, as with the telephone. An image of the
scene can be produced on the screen of a television
set by feeding into its CRT the electrical signal
received. The main difference between the picture tube of a television set and that used in a television camera is the coating on their screens: while a photoconducting material, say selenium, is used for coating the screen of the CRT inside a camera, chemicals known as phosphors were (and still are) used on the screen of a TV. A small dot of phosphor produces a dot of light when cathode rays fall on it. This light lasts only a fraction of a second. Such light
dots produce a transient image on the screen. A
sequence of such transient images produces a
movie.
Camera
The earliest form of a camera is often called
"Camera Obscura". A Chinese philosopher Mo-Ti
(5th century BC) was perhaps the first person to
mention this type of device. He formally recorded
the creation of an inverted image formed by light
rays passing through a pinhole into a darkened
room. He called this darkened room a "collecting
place" or the "locked treasure room." Aristotle (384-
322 BC), a famous Greek philosopher, also
understood the optical principle of the "camera
obscura". He viewed the crescent shape of a
partially eclipsed sun projected on the ground
through the holes in a sieve, and the gaps between
leaves of a plane tree.
The earliest "Camera Obscuras" were large rooms
that were used to observe a solar eclipse. A convex
lens was used into the aperture in the 16th century
to improve the image quality; a mirror was added to
reflect the image down onto a viewing surface. This
device was often used as an aid for drawing for
artists. Soon thereafter, in 1807, another kind of
camera known as "Camera Lucida" was invented.
No darkroom was needed for this kind of camera.
The paper was laid flat on the drawing board, and
the artist would look through a lens containing a prism, so that he could see both the paper and a
faint image of the subject to be drawn. He would
then fill in the image.
Obtaining a direct recording of an image that did
not require the skills of an artist was not possible till
certain chemical substances that changed their
properties when they are exposed to light became
known. A German scientist, Johann Heinrich Schulze, discovered in 1727 that if he mixed three chemicals - chalk, nitric acid, and silver - in a flask, the side of the flask facing sunlight darkened. In 1800, a scientist from England, Thomas Wedgwood, made the first "sun pictures" by placing opaque objects on leather treated with a chemical called silver nitrate. However, these pictures survived only under candlelight; under any stronger source of light they deteriorated very fast. It was not until 1826, when a French scientist, Nicéphore Niépce, combined the camera obscura with a photosensitive material, that it became possible to obtain a permanent image. Soon thereafter, in
1834, another English scientist, Henry Fox Talbot,
used paper impregnated with silver nitrate or silver
chloride. When exposed in a camera, this paper
turned black where light struck it, creating a
negative image of the subject. This was made
permanent by fixing with hypo. The images so
obtained were of course only black and white. (The
story of the development of photography is very aptly
detailed on the website
http://www.scphoto.com/html/history.html.)
These early developments led to many other discoveries and inventions that made it possible for newspapers to carry photographs by 1880. By 1900 one could purchase a camera and shoot pictures using photographic film produced by the company Eastman Kodak.
Electricity
A Greek philosopher, Thales of Miletus, who lived about 600 BC, is said to have discovered that amber acquires a power to attract light objects when rubbed. Another Greek philosopher, Theophrastus, in a treatise written about three centuries later, noted that some other substances also possess this power.
Similarly, ancient Greeks as well as Chinese knew
magnets. But, it was not until AD 1600, when an
English physician William Gilbert studied both of
them in detail and his observations were available
in printed form, that the facts about electricity and
magnetism became widely known. Gilbert was the
first person to apply the term electric (Greek
elektron, "amber") to the force that such substances
exert after rubbing. He also distinguished between
magnetic and electric action.
The first machine for producing an electric charge
was invented in 1672 by a German scientist Otto
von Guericke. It consisted of a sulfur sphere turned by a crank; a charge was induced on the sphere when a hand was held against it. The French scientist
Charles Francois de Cisternay Du Fay was the first to
discover that there are two different types of
electric charge: positive and negative. The earliest
device to store electric charge, the Leyden jar, was
invented in 1745 at the University of Leiden in the
Netherlands. It consisted of a glass bottle with
separate coatings of tinfoil on the inside and
outside. One sensed a violent shock by touching
both coatings of the foil simultaneously.
Benjamin Franklin, an American scientist, spent much time studying electricity. Through his famous
kite experiment he discovered that the atmospheric
electricity that causes the phenomena of lightning
and thunder is identical with the electrostatic
charge on a Leyden jar. Franklin suggested that
electricity is a "fluid" existing in all matter, and that
its effects can be explained by excesses and
shortages of this fluid.
Alessandro Volta, an Italian scientist, invented the first device capable of producing an electric current (electricity), the battery. He found that if pieces of two different metals were separated by a cardboard disk soaked in brine (salt solution), an electric current flowed through wires connected to these metal pieces. In 1800, he announced a new
electrical device, the Voltaic Pile. This device was
made of alternating disks of zinc and copper with
each pair separated by brine soaked cloth. This was
the first battery.
In 1831 Michael Faraday, a British scientist, discovered electromagnetic induction. This is a
method for producing a steady electric current.
Faraday attached two wires through a sliding
contact to a copper disc. By rotating the disc
between the poles of a horseshoe magnet he
obtained a continuous direct current. This was the
first electric generator; it led to the establishment of
the first electric power station in 1888.
Blood groups
The true nature and function of blood have been
shrouded in mystery ever since the beginning of
history. Using human blood to treat disease and
trauma had its beginnings in 1667 in France, when
Jean-Baptiste Denis documented a direct human
blood transfusion. This was a scant forty years after
William Harvey discovered the circulatory system.
These early direct donor-to-patient transfusions
were, however, frequently disastrous because it
was not possible to predict donor-recipient blood
type compatibility.
Experiments with blood transfusions, the transfer of
blood or blood components into a person's blood
stream, have been carried out for hundreds of
years. Many patients died because of blood
transfusions. In 1818, James Blundell transfused blood to a woman from her husband, and it worked. But other patients died from transfusions. It was decades later that a German named Leonard Landois learned why blood mixing can be fatal: sometimes it makes red blood cells clump and burst.
During 1901-1903, Karl Landsteiner, an Austrian scientist, pointed out that a similar reaction may occur when the blood of one
human individual is transfused, not with the blood
of another animal, but with that of another human
being, and that this might be the cause of shock,
jaundice, and haemoglobinuria that had followed
some earlier attempts at blood transfusions.
His suggestions, however, received little attention
until, in 1909, he classified the bloods of human
beings into three blood groups A, B, and C. These
eventually became known as A, B, and O. The rarer
group AB was not discovered until the following
year by two of Landsteiner's pupils. These groups
are now well-known as A, B, AB, and O groups.
Landsteiner showed that transfusions between individuals of the same group do not result in the destruction of blood cells, and that this catastrophe occurs only when a person is transfused with the blood of a person belonging to a different group.
Karl Landsteiner's work made it possible to
determine blood types and thus paved the way for
blood transfusions to be carried out safely. For this
discovery he was awarded the Nobel Prize in
Physiology or Medicine in 1930. Later, in 1940, Landsteiner and Wiener made observations which laid the foundations of our knowledge of another major blood group system - the Rhesus system.
Once reliable tests for Rhesus grouping had been
established, deaths due to blood transfusion
became rare.
Printing
Written language is unquestionably one of the most important human achievements; the ability to reproduce written materials quickly and efficiently therefore ranks not far behind. Only when written
works could be duplicated in quantities and speeds
exceeding those achievable through laborious
handwritten copies did writing become a medium
for the widespread dissemination of knowledge—the
more copies of material available, the more people
who have access to them, the more likely the
spread of literacy. The challenge, particularly in
civilizations with large, complex systems of writing,
was to develop a method for quickly and efficiently
arranging those symbols, using the arrangement to
create printed material, then re-arranging the
symbols for further use. Chinese printers were the
first to structure printing in a way that hinted at
mass-production in the 8th century. They used
wooden blocks with characters carved into them,
which were then inked and stamped on paper.
Extending the Chinese lead in printing, in the 11th century Pi Sheng created a primitive form of moveable type (made of baked clay), which allowed the letters to be rearranged. In the neighboring country of Korea, moveable metal type was tried in the early 15th century, but it was not very successful due to the large number of characters in the script then in use. In Europe printing developed a bit later. Till the beginning of the 15th century, Europeans followed the method introduced by the Chinese -- block printing.
As the methods for casting metals became known,
the invention of a machine to print became
possible. An innovator in Germany, Johann Gutenberg, spent over ten years developing western-style moveable type. He developed a method using lead and tin alloys to mold moveable type for the individual letters of the Roman script. He also invented a machine, the printing press, that was based on the design of presses used by farmers to make olive oil. The first printing press used a heavy screw to force a printing block against the paper below, and the ink used was a mixture of turpentine, lampblack and linseed oil. Invented by 1450, such a printing press made the mass publication and circulation of literature easy and economical. In the later models, as machines
became more popular, inking was carried out by
rollers. These rollers would pass over the face of the
type and move out of the way onto a separate ink-
bed to pick up a fresh film of ink. A sheet of paper
was slid against a hinged plate, which was rapidly
pressed onto the type and then swung back, allowing the printed sheet to be removed and the next sheet inserted in its place.
Bacteria
Bacteria are the most abundant of all organisms.
They are ubiquitous in soil and water. Bacteria
consist of only a single cell, but they are an
amazingly complex and fascinating group of
creatures. When most people think of bacteria, they
think of disease-causing organisms. But that is not strictly true. Several kinds of bacteria are extremely helpful to us; they help us make medicines or prepare food.
Antony van Leeuwenhoek, the man who discovered
bacteria, was neither a physician nor a university
professor but a Dutch draper and part-time janitor
who liked to look at things under a microscope.
Leeuwenhoek came from a family of tradesmen,
had no fortune, received no higher education or
university degrees, and knew no languages other
than his native Dutch. This would have been
enough to exclude him from the scientific
community of his time completely. Yet with skill,
diligence, an endless curiosity, and an open mind
free of the scientific dogma of his day,
Leeuwenhoek succeeded in making some of the
most important discoveries in the history of biology.
It was he who discovered bacteria.
Leeuwenhoek's skill at grinding lenses, together
with his naturally acute eyesight and great care in
adjusting the lighting where he worked, enabled
him to build microscopes that magnified over 200
times, with clearer and brighter images than any of
his colleagues could achieve. What further
distinguished him was his curiosity to observe
almost anything that could be placed under his
lenses, and his care in describing what he saw.
Although he himself could not draw well, he hired
an illustrator to prepare drawings of the things he
saw, to accompany his written descriptions. Most of
his descriptions of microorganisms are instantly
recognizable.
In 1673, Leeuwenhoek began writing letters to the
newly formed Royal Society of London, describing
what he had seen with his microscopes -- his first
letter contained some observations on the stings of
bees. For the next fifty years he corresponded with
the Royal Society; his letters, written in Dutch, were
translated into English or Latin and printed in the
Philosophical Transactions of the Royal Society, and
often reprinted separately.
Rocket
The rocket is an indispensable tool in the exploration of
space. Today's rockets are remarkable collections of
human ingenuity that have their roots in the science
and technology of the past.
One of the first devices to successfully employ the
principles essential to rocket flight was a wooden
bird. The writings of Aulus Gellius, a Roman, tell a
story of a Greek named Archytas who lived in the
city of Tarentum, now a part of southern Italy.
Somewhere around the year 400 B.C., Archytas
mystified and amused the citizens of Tarentum by
flying a pigeon made of wood. Escaping steam
propelled the bird suspended on wires. The pigeon
used the action-reaction principle, which was not
stated as a scientific law until the 17th century.
About three hundred years after the pigeon,
another Greek, Hero of Alexandria, invented a
similar rocket-like device called an aeolipile. It, too,
used steam as a propellant.
Hero mounted a sphere on top of a water kettle. A
fire below the kettle turned the water into steam,
and the gas traveled through pipes to the sphere.
Two L-shaped tubes on opposite sides of the sphere
allowed the gas to escape, and in doing so gave a
thrust to the sphere that caused it to rotate.
No one is really sure when the first true rocket was
built. Stories of early rocket-like devices appear
sporadically through the historical records of various
cultures. Perhaps the first true rockets were
accidents. In the first century A.D., the Chinese
reportedly had a simple form of gunpowder made
from saltpeter, sulfur, and charcoal dust. To create
explosions during religious festivals, they filled
bamboo tubes with the mixture and tossed them into
fires. Perhaps some of those tubes failed to explode
and instead skittered out of the fires, propelled by
the gases and sparks produced by the burning
gunpowder.
The Chinese began experimenting with the gunpowder-filled tubes. At some point, they attached bamboo tubes to arrows and launched them with bows. Soon
they discovered that these gunpowder tubes could
launch themselves just by the power produced from
the escaping gas. The true rocket was born.
The first reported use of true rockets was in 1232. At this time, the Chinese and the Mongols
were at war with each other. During the battle of
Kai-Keng, the Chinese repelled the Mongol invaders
by a barrage of "arrows of flying fire." These fire-
arrows were a simple form of a solid-propellant
rocket. A tube, capped at one end, contained
gunpowder. The other end was left open and the
tube was attached to a long stick. When the powder
was ignited, the rapid burning of the powder
produced fire, smoke, and gas that escaped out the
open end and produced a thrust. The stick acted as
a simple guidance system that kept the rocket
headed in one general direction as it flew through
the air. Although one cannot be sure how effective these arrows of flying fire were as weapons of destruction, their psychological effect on the Mongols was formidable.
Following the battle of Kai-Keng, the Mongols
produced rockets of their own and may have been
responsible for the spread of rockets to Europe. All
through the 13th to the 15th centuries there were
reports of many rocket experiments. In England, a
monk named Roger Bacon worked on improved
forms of gunpowder that greatly increased the
range of rockets. In France, Jean Froissart found that
more accurate flights could be achieved by
launching rockets through tubes. Froissart's idea
was the forerunner of the modern bazooka. Joanes
de Fontana of Italy designed a surface-running
rocket-powered torpedo for setting enemy ships on
fire.
By the 16th century rockets were mainly used for
fireworks displays, and a German fireworks maker,
Johann Schmidlap, invented the "step rocket," a
multi-staged vehicle for lifting fireworks to higher
altitudes. A large sky rocket (first stage) carried a
smaller sky rocket (second stage). When the large
rocket burned out, the smaller one continued to a
higher altitude before showering the sky with
glowing cinders. Schmidlap's idea is basic to all
rockets today that go into outer space.
Nearly all uses up to this time were for warfare or
fireworks, but there is an interesting old Chinese
legend that reported the use of rockets as a means
of transportation. With the help of many assistants,
a lesser-known Chinese official named Wan-Hu
assembled a rocket-powered flying chair. Attached to the chair were two large kites, and fixed to the kites were forty-seven fire-arrow rockets.
During the early introduction of rockets to Europe,
they were used only as weapons. Enemy troops in
India repulsed the British with rockets. During the
19th century, rocket enthusiasts and inventors
began to appear in almost every country. By the
end of the 19th century, soldiers, sailors, practical
and not so practical inventors got interested in
rocketry. Skillful theorists, like Konstantin Tsiolkovsky in Russia, studied the science behind
rocketry. They also examined the possibility of
space travel. Three persons were particularly
significant in the transition from the small rockets of
the 19th century to the giant rockets of today. They
were: Konstantin Tsiolkovsky from Russia, Robert
Goddard from the United States, and Hermann
Oberth from Germany.
Cells
Most cells are too small to be observed with the naked eye. It is for this reason that the existence of cells escaped notice until scientists first learned to harness the magnifying power of lenses in the second half of the seventeenth century, that is, until the invention of the microscope. A Dutch clothing dealer named Antonie van Leeuwenhoek invented single-lens microscopes. Gazing into the lens of
these microscopes, he discovered single-celled
organisms, which he called "animalcules" and
which, today, we call bacteria and protists.
Englishman Robert Hooke expanded on
Leeuwenhoek’s observations with the newly
developed compound microscope, which uses two
or more aligned lenses to increase magnification
while reducing blurring. When Hooke turned the
microscope on a piece of cork, he noticed that the
tiny, boxlike compartments of the wood resembled
the cells of a monastery. The term 'cell' was born.
As microscope technology improved, scientists were
able to study cells in ever-greater detail. Hooke had
no way to tell if cells were living things, but later
researchers who could see the nucleus and the
swirling motion of the cytoplasm were convinced
that cells were indeed alive. By 1839, enough
evidence had accumulated for German biologists
Matthias Schleiden and Theodore Schwann to
proclaim that cells are "the elementary particles of all biological organisms." But many scientists still did not believe that cells arose from other cells until 1855, when a famous German scientist, Rudolf Virchow, pronounced, "All cells come from cells."
Nearly 200 years after the discovery of cells, the
observations of Virchow, Schleiden, and Schwann
established the cell theory. According to this theory:
All living things are made of cells and all cells arise
from preexisting cells.
Antibiotics
The use of antibiotics is often considered one of the
wonders of the modern world. It has had dramatic
effects on the practice of medicine, the
pharmaceutical industry, and microbiology. Prior to
the discovery of antibiotics, the treatment of
infectious diseases was not very scientific. Various
types of antimicrobial agents, including extracts of
plants, fungi, and lichens, were employed for
thousands of years in primitive populations without
any scientific knowledge of what was being used.
Even in the early part of the twentieth century,
therapy for infectious diseases was based
essentially on patient isolation and chicken soup.
The search for antibiotics began in the late 1800s,
with the growing acceptance of the germ theory of disease, a theory which linked bacteria and other microbes to the causation of a variety of ailments.
As a result, scientists began to devote time to
searching for drugs that would kill these disease-
causing bacteria. The goal of such research was to
find so-called “magic bullets” that would destroy
microbes without toxicity to the person taking the
drug.
One of the earliest areas of scientific exploration in
this field was whether harmless bacteria could treat
diseases caused by pathogenic strains of bacteria.
By the late 19th century there were a few notable
breakthroughs. In 1877, Louis Pasteur showed that
the bacterial disease anthrax, which can cause
respiratory failure, could be rendered harmless in
animals with the injection of soil bacteria. In 1887,
Rudolf Emmerich showed that cholera was
prevented in animals that had been previously
infected with the streptococcus bacterium and then
injected with the cholera bacillus.
In the early 1920s, the British scientist Alexander
Fleming reported that a product in human tears could break down (lyse) bacterial cells. Soon he made another discovery that changed the course of medicine. In 1928, Fleming discovered another
antibacterial agent. He named this substance
penicillin after the Penicillium mold that had
produced it. By extracting the substance from
plates, Fleming was able to show its effects;
penicillin destroyed a common bacterium,
Staphylococcus aureus, associated with sometimes
deadly skin infections.
Petroleum
Petroleum is one of the most important sources of
energy today. But for petrol and diesel, fuels extracted from petroleum, travel would be a nightmare. Petroleum is often found deep under the Earth's surface. The first oil wells to extract
petroleum were drilled in China in the 4th century
or earlier. They were up to 800 feet deep and were
drilled using bits attached to bamboo poles. The oil
was burned to evaporate seawater and produce
salt. By the 10th century, extensive bamboo
pipelines connected oil wells with salt springs.
Ancient Persian tablets indicate the medicinal and
lighting uses of petroleum in the upper echelons of
their society.
In the 8th century, the streets of the newly
constructed Baghdad were paved with tar, derived
from easily-accessible petroleum from natural fields
in the region. In the 9th century, oil fields were
exploited in Baku, Azerbaijan, to produce naphtha.
These fields, whose output amounted to hundreds of shiploads, were described by the geographer Masudi in the 10th century and by Marco Polo in the 13th century.
The modern history of oil began in 1853, with the
discovery of the process of oil distillation. Crude oil
was distilled into kerosene by Ignacy Lukasiewicz, a
Polish scientist. The first "rock oil" mine was created
in Bobrka, near Krosno in southern Poland in the
following year and the first refinery (actually a
distillery) was built in Ulaszowice, also by
Lukasiewicz. These discoveries rapidly spread
around the world, and Meerzoeff built the first
Russian refinery in the mature oil fields at Baku in
1861. The first modern oil well had been drilled earlier, in 1848, by a Russian engineer, F.N. Semyenov, on the Apsheron Peninsula north-east of Baku.
By 1910, significant oil fields had been discovered
in Canada (specifically, in the province of Alberta),
the Dutch East Indies (1885, in Sumatra), Persia
(1901, in Masjed Soleiman), Peru, Venezuela, and
Mexico, and were being developed at an industrial
level. The Indian petroleum industry dates back to
1890 when oil was first struck at Digboi in
northeastern India. However, the most significant
discovery of petroleum in India was that at Bombay
High, on 19th February 1974, which, in reality, was the turning point in the history of oil exploration in India.
Oxygen
Oxygen supports all life on Earth and is essential for
combustion and respiration, yet its very existence
was not known until the 1770s, when scientists
began to concern themselves with air and how it
affects combustion. The ancient Greeks considered
air to be an element composed of a single
substance, and this view persisted through the
centuries. The discovery of oxygen, its significance,
and the capability for measuring it accurately
required some major scientific breakthroughs.
In the early 1700s, G.E. Stahl, a German physician, proposed a
theory that received widespread acceptance. He
claimed that all inflammable objects contained a
material substance that he called "phlogiston," from
a Greek word meaning "to set on fire." When an
object burned, it poured its content of phlogiston
into the air, and when all its phlogiston was gone, it
stopped burning. Wood lost its phlogiston very
rapidly, so that its passage into air was visible as
flames. Stahl suggested that the rusting of metals
also depended on the loss of phlogiston to
surrounding air, except that metals lost their
phlogiston so slowly that rusting was a gradual
process.
Experiments to learn more about the principles of
combustion were made in 1772 by a Scottish
chemist, Joseph Black, and his student, Daniel
Rutherford. They tried burning candles in closed
containers of air and found that the candles
eventually went out even though the containers still
held a large amount of air. Mice put into these
containers promptly died. Holding to the phlogiston
theory, Rutherford came to the conclusion that the
burning candles emitted phlogiston but a given
volume of air could hold only a certain amount of
phlogiston. When the saturation point was reached
in the closed container, the air would not accept
any more phlogiston and the candle would go out
because it could not continue to emit phlogiston.
Rutherford believed that, in like manner, a living
creature gives up phlogiston while breathing and
when placed in air that is already saturated with
phlogiston, can no longer breathe and must die.
The demolition of the phlogiston theory began with
experiments carried on in 1774 by Joseph Priestley,
an English clergyman who was interested in
science. His experiments involved the heating of
mercury by exposing it to sunlight concentrated
through a magnifying glass. The heated mercury
became coated with a reddish powder, which
Priestley reheated at a higher temperature. The
powder evaporated and turned into two gases, one
of which was mercury vapor. The mercury vapor
condensed into drops of mercury in the test vessel
as it cooled. The other gas was invisible, but
Priestley knew it existed because when he placed a
smoldering splint of wood into it, the wood burst
into flame, and mice put into this invisible gas
became hyperactive. Priestley stuck to the
phlogiston theory to explain these results. He
thought that heated mercury lost some of its
phlogiston and turned into mercury rust. When this
rust was heated, it absorbed phlogiston from air and
turned back into mercury. The invisible gas had
also lost its phlogiston, and it drew phlogiston
rapidly from the smoldering wood splint, causing
the splint to burst into flame.
Priestley later traveled to Paris, where he discussed
his experiments with a brilliant French chemist,
Antoine Lavoisier, who had been carrying on his
own experiments in combustion. Lavoisier's
experiments had convinced him that phlogiston did
not exist and combustion was caused by the
combination of fuel with air. However, he was
unable to prove his theory until Priestley described
his experiments with heated mercury. Lavoisier had
burned candles in closed containers and he had
observed that only one-fifth of the air was
consumed during burning and the remaining four-
fifths would not support combustion. After his
discussions with Priestley, Lavoisier realized that what Priestley called two different kinds of air - one
with phlogiston and one without - was really only
one kind of air that contained two substances.
Lavoisier called the one-fifth of the air that
supported combustion "oxygen" (from the Greek
words meaning "acid-producing," because he
thought (wrongly) that oxygen was a necessary
component of all acids). The four-fifths of the air
that does not support respiration or combustion he
called "azote" (from the Greek words meaning "no
life"). Azote today is known as "nitrogen”.
Paper
Nobody knew paper 1900 years ago! A Chinese,
T'sai Lun, invented paper in 105 AD. He
experimented with a wide variety of materials and
refined the process of macerating the plant fibers
until each filament was completely separate. The
individual fibers were mixed with water in a large
vat and then a screen was submerged in the vat
and lifted up through the water, catching the fibers
on its surface. When dried, this thin layer of
intertwined fiber became paper. T'sai Lun's thin, yet
flexible and strong paper with its fine, smooth
surface was known as T'sai Ko-Shi, meaning "Distinguished T'sai's Paper", and he became the patron "saint" of papermaking. It took about a hundred years for the use of paper to spread across central Asia. Books followed soon after. Producing books required printing. The very first books were printed in China by stamping seals (something like the rubber stamps used nowadays) on paper.
The utility of books prompted people to improve the
technique of making paper. The pioneers of this
venture were mostly Asians. The Japanese discovered a method to make paper from waste paper. Egyptians used cloth rags to make paper.
This knowledge gradually made its way to the
western countries through the Muslim world - to
Baghdad, Damascus and Cairo and ultimately to
Europe in the 12th century.
The Europeans quickly grasped the merits of
printing on paper. They improvised methods for
making paper on a large scale. The earliest paper in
Europe was made from recycled cotton and linen.
This was an impetus for the trade of old rags. When
this source became insufficient curious attempts
were made to source new materials - the most
macabre of which was the recycling of Egyptian
mummies to create wrapping paper! They also
experimented with fibers such as straw, cabbage,
wasp-nests and finally wood. Ultimately this quest ended when an inexpensive and renewable material for papermaking - the long soft fibers of softwoods such as spruce - was discovered. A paper mill, that is, a factory to produce paper on a large scale, was established in England for the first time in the year 1495.
Refrigerator
Refrigeration is the process by which heat is removed from an enclosed space, or from a substance, and transferred elsewhere in order to lower its temperature. A
refrigerator uses the phenomenon of evaporation of
a liquid to absorb heat. The liquid, or refrigerant,
used in a refrigerator evaporates at an extremely
low temperature, creating freezing temperatures
inside the refrigerator.
Before the invention of the refrigerator, people cooled
their food with ice and snow, either found locally or
brought down from the mountains. The first cellars
were holes dug into the ground and lined with wood
or straw and packed with snow and ice: this was the
only means of refrigeration for most of history.
Refrigeration involves the following process: a refrigerant is first compressed and then allowed to expand and vaporize rapidly; the quickly expanding vapor requires energy and draws the energy it needs from its immediate surroundings, which lose energy and become cooler. Cooling caused by the rapid expansion of gases is the primary means of refrigeration today.
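To get a feel for the numbers involved, suppose the refrigerant absorbs about 200 kilojoules of heat for every kilogram that evaporates (a figure of a typical order of magnitude, used here only for illustration). Evaporating a tenth of a kilogram inside the cabinet then draws

$$Q = m \times L \approx 0.1\ \text{kg} \times 200\ \text{kJ/kg} = 20\ \text{kJ}$$

of heat out of the food compartment.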
William Cullen at the University of Glasgow was
perhaps the first inventor who demonstrated a
refrigerator in 1748. However, he did not use his
discovery for any practical purpose. In 1805, an
American inventor, Oliver Evans, designed the first
refrigeration machine. Jacob Perkins built the first
practical refrigerating machine in 1834; it used
ether in a vapor compression cycle. An American
physician, John Gorrie, built a refrigerator based on
Oliver Evans' design in 1844 to make ice to cool the
air for his yellow fever patients. In 1876 the German engineer Carl von Linde patented, not a refrigerator, but a process of liquefying gas that is part of basic refrigeration technology.
Till about 1929 refrigerators used the toxic gases
ammonia (NH3), methyl chloride (CH3Cl), and sulfur
dioxide (SO2) as refrigerants. Several fatal
accidents occurred in the 1920s when methyl
chloride leaked out of refrigerators. Three American
corporations launched collaborative research to develop a less dangerous method of refrigeration; their efforts led to the discovery of Freon. In just a few years, compressor refrigerators using Freon became the standard for almost all home kitchens. Only decades later would people realize that these chlorofluorocarbons endangered the ozone layer of the entire planet.
The Cell phone
The basic concept of cellular phones began in 1947, when some engineers in the USA looked at the crude mobile (car) wireless phones that were in use at that time. The number of users of such phones was very small because the number of frequencies available for them was limited. It was realized that if the transmission of radio waves was limited to small areas, or 'cells', a frequency could be reused in another, distant cell, thus increasing the traffic capacity of mobile phones substantially. However, at that time the technology to do so was nonexistent.
In the USA, anything to do with wireless communication is regulated by an agency known as the Federal Communications Commission (FCC). Since a cell
phone is a type of two-way radio, in 1947, an
American company AT&T proposed that the FCC
allocate a larger number of frequencies of
electromagnetic waves capable of radio
communication to make mobile telephone service
feasible. But FCC declined this request.
This position was reconsidered in 1968. AT&T and
'Bell Labs' then proposed the present form of
cellular system. In this system many small, low-powered broadcast towers, each covering a 'cell' a few kilometers in radius, collectively cover a large area. Each tower uses only a few of the total frequencies allocated to the system. As the phones
travel across the area, calls are passed from tower
to tower.
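A toy sketch of the reuse idea is given below in Python; the cell names, the number of available frequencies and the repeating pattern are all invented for illustration, and real systems are far more elaborate:

```python
# Imagine a strip of seven adjacent cells served by only three frequencies.
cells = ["A", "B", "C", "D", "E", "F", "G"]
frequencies = [1, 2, 3]

# Assign frequencies in a repeating pattern so that no two neighbouring
# cells share a frequency, yet every frequency is reused along the strip.
assignment = {cell: frequencies[i % len(frequencies)]
              for i, cell in enumerate(cells)}

print(assignment)
# {'A': 1, 'B': 2, 'C': 3, 'D': 1, 'E': 2, 'F': 3, 'G': 1}
# Frequency 1 serves cells A, D and G at the same time: they are far
# enough apart not to interfere, which is what multiplies capacity.
```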
Dr Martin Cooper, a general manager at an
American company 'Motorola', is considered the
inventor of the portable cellphone handset. Cooper
made the first call on a portable cell phone in April
1973. He made the call to his rival, Joel Engel.
Motorola was the first company to incorporate the technology into a portable device designed for use even outside an automobile. By 1977, AT&T and Bell Labs had constructed a prototype cellular system.
Pencil and Pen
The Greeks introduced the earliest instrument of
writing that approached the pen. They employed a
writing stylus, made of metal, bone or ivory, to
place marks upon wax-coated tablets. The tablets, made in hinged pairs, closed to protect the scribe's notes. Thus the first examples of handwriting (purely text messages made by hand) originated in Greece. A scholar from Greece, Cadmus, is said to have invented
the written letter - text messages on paper sent
from one individual to another.
The instrument used for writing that dominated for
the longest period in history (over one thousand
years) was the quill pen. Introduced around 700
A.D., the quill was a pen made from
a bird feather. The strongest quills
were those taken from living birds
in the spring from the five outer left
wing feathers. The left wing was
favored because the feathers
curved outward and away when
used by a right-handed writer.
Goose feathers were most
common; swan feathers were of a
premium grade being scarcer and
more expensive. For making fine lines, crow
feathers were the best, followed by the feathers of
the eagle, owl, hawk and turkey.
The pencil was most likely invented in England, after some shepherds in Borrowdale, sometime around 1564, found, beneath an oak tree that had fallen during a storm, small pieces of a black material that proved useful for marking sheep. Soon
thereafter small pieces of this material were
encased in wood to produce a sturdy and clean
writing instrument that needed no ink. Many people
have wondered why the core of a pencil is called
“lead”. The answer perhaps lies in the fact that
Greeks and Romans used small disc-shaped pieces of lead to write, way back in 20 B.C. That's why the material discovered by the shepherds was initially known as "plumbago" (imitation lead), until a Swedish scientist, C.W. Scheele, found it to be a form of carbon and gave it the name "graphite" (from the
Greek word “Graphis” for writing). Fountain pens
and the ballpoint pens came much later, the earliest
surviving fountain pens date to the early 18th (or
possibly later 17th) century; they are made of
metal, or cut quills used as nibs. From the beginning
of the 19th century, the number of fountain pen
designs patented and produced began to multiply.
Three major advances paved the way for the
fountain pen’s widespread acceptance: the
invention of hard rubber (a naturally-derived plastic,
resistant to chemicals, easily machined, and
relatively cheap); the availability of iridium-tipped
gold nibs; and improved inks, not laden with
clogging sediment. But all three factors fell into place only later, around 1870 to 1880.
Computer
The story of the invention of the computer differs from that of the television. It was invented not by any single individual but through the efforts of several groups of people working for large commercial establishments. The earliest computer was somewhat like a programmable calculator; it could only perform mathematical calculations. A German inventor, Konrad Zuse, is often credited with the invention of the first programmable computer. He made the world's first fully programmable digital computer, the Z3, in 1941, with recycled materials donated by his university colleagues. Five years later, in 1946,
John Mauchly and J Presper Eckert developed the
ENIAC (Electronic Numerical Integrator And Calculator) under a project sponsored by the U.S.
military. This computer covered 167 square meters
of floor space, weighed 30 tons, and consumed 160
kilowatts of electrical power. In one second, it could
perform 5,000 additions, 357 multiplications or 38
divisions.
Later computers became significantly smaller. This
became possible due to the invention of a device,
the transistor. Three American scientists, John Bardeen, William Shockley, and Walter Brattain, invented the transistor while working at Bell Telephone Laboratories in the U.S.A. (the research arm of the company founded by Alexander Graham Bell, the inventor of the telephone). They hit upon it while studying the behavior of germanium crystals, looking for something to replace the vacuum tubes and mechanical relays then used in telecommunications; the vacuum tubes of the day consumed a great deal of electricity and produced unwanted heat. A transistor is made from semiconductor materials. A semiconductor is a kind of material that can conduct electricity as well as stop its flow (act as an insulator). The chemical elements germanium and silicon are two examples of semiconductor materials. The transistor was so named because it can act as a transmitter, converting signals into waves of electric current, and as a resistor, controlling electric current. Not surprisingly, transistors soon replaced
vacuum tubes in the computers. Computers made
up of transistors were more reliable and consumed
much less electricity.
The next step was the integration of many electronic devices onto a tiny crystal of silicon: the integrated circuit (IC). Until 1959, it was believed that making a computer more powerful simply meant packing ever more separate electrical components into it. After the invention of the integrated circuit, hundreds of transistors, resistors, capacitors and connecting wires could be put into a single component, the chip. A chip is made on a single crystal of a
semiconductor material. The technology for making an IC was invented by two American engineers: Jack Kilby, working for Texas Instruments, and Robert Noyce, co-founder of the Fairchild Semiconductor Corporation. Further development of the computer came with an IC designed specifically for computing. In 1971 the company Intel introduced the first commercial microprocessor, the Intel 4004. Three Intel employees are credited with the invention of this chip: Federico Faggin, Ted Hoff, and Stan Mazor. In this IC, all the parts that make a computer "think" (the central processing unit, memory, and input and output controls) were placed on a single chip.
The person who can perhaps be called the inventor
of personal computers is Douglas Engelbart. He
invented or contributed to several interactive
devices and features: the computer mouse,
windows, computer video teleconferencing, email,
the Internet and more. However, the real revolution in personal computing was the handiwork of a few computer whiz-kids, notably Bill Gates and Steve Jobs, who developed software that ordinary users found genuinely friendly.
Electric lamp
The story of the invention of electric lamp goes
back to 1811, when Sir Humphry Davy discovered
that an electrical arc passed between two poles
produced light. In 1841, experimental arc lights
were installed as public lighting along the Place de
la Concorde in Paris. Other experiments were
undertaken in Europe and America, but the arc light
eventually proved impractical because it burned out
too quickly. Inventors continued to grapple with the
problem of developing a reliable electric light that
would be practical for both home and public use as
a viable alternative to light from burning gas.
However, the practical solution for producing light
from electricity lay not in an electrical arc in open
space, rather in electricity passed through a
filament. The breakthrough theory became known as the Joule effect, after James Prescott Joule, who showed that an electric current passed through a resistive conductor heats it, and that a sufficiently hot conductor glows, turning electrical energy into light. The problem was devising the right conductor, or filament, and enclosing it in a container, or bulb, from which oxygen had been removed, because the presence of oxygen would cause the filament to burn out.
Sir Joseph Wilson Swan, an English inventor, was the first person to construct an electric light bulb, but he had trouble maintaining a vacuum inside it. Thomas Alva Edison, the legendary American inventor, solved this problem, and on October 21, 1879, he lit a carbon-filament bulb that glowed continuously for 40 hours.
In the period from 1878 to 1880 Edison and his
associates worked on at least three thousand
different theories to develop an efficient
incandescent lamp. Incandescent lamps make light
by using electricity to heat a thin strip of material
(called a filament) until it gets hot enough to glow.
Many inventors had tried to perfect incandescent
lamps to "sub-divide" electric light or make it
smaller and weaker than it was in the existing arc
lamps, which were too bright to be used for small
spaces such as the rooms of a house.
Edison's lamp consisted of a filament inside a glass bulb from which all the air had been removed.
Edison was targeting a high resistance system that
would require far less electrical power than was
used for the arc lamps. He knew such small electric
lights would be suitable for home use.
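The Joule effect, and Edison's preference for a high-resistance filament, can be put into rough numbers. The sketch below is only an illustration: the supply voltage and the two resistances are invented, not the ratings of any historical lamp. It uses the basic relations I = V / R and P = V^2 / R.

    # Joule heating in a filament: current I = V / R, power P = V**2 / R.
    # The voltage and resistances below are invented, purely for illustration.
    voltage = 110.0                       # supply voltage in volts
    for resistance in (2.0, 140.0):       # a low- vs a high-resistance filament, in ohms
        current = voltage / resistance             # amperes drawn from the supply
        power = voltage ** 2 / resistance          # watts turned into heat and light
        print(f"R = {resistance:6.1f} ohm -> I = {current:6.2f} A, P = {power:8.1f} W")

At the same supply voltage the high-resistance filament draws far less current, which is why a high-resistance lamp could be fed through much thinner, cheaper wiring than an arc lamp.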
By January 1879, at his laboratory in Menlo Park,
New Jersey, Edison had built his first high
resistance, incandescent electric light. It worked by
passing electricity through a thin platinum filament
in the glass vacuum bulb, which delayed the
filament from melting. Still, the lamp only burned
for a few short hours. In order to improve the bulb,
Edison needed all the persistence he had learned
years before in his basement laboratory. He tested
thousands and thousands of other materials to use
for the filament. He even thought about using
tungsten, which is the metal used for light bulb
filaments now, but he couldn’t work with it given
the tools available at that time.
Automobiles
Transportation had changed very little between the
time of the Romans and the early 1800s. People
walked, rode horses, or rode in slow vehicles pulled
by horses. At sea, people relied upon wind and
muscle power. The word Automobile means a self-
propelled vehicle. Such vehicles do not need an
animal to move rather they depend on the energy
in a fuel, say coal, petrol, diesel etc.
Nicholas Cugnot, a French engineer, invented the first automobile in 1769. It was based on a steam engine and looked like a massive tricycle. This ancestor of the automobile can perhaps still be seen in Paris. In 1873 Amedee Bollee, another Frenchman, built an automobile called the Obéissant, a French word meaning obedient. It looked like a bus.
However, the steam engine proved impractical for a machine intended to challenge the speed of a horse and buggy. The invention of the practical automobile had to await a workable internal combustion engine. An internal combustion engine, in contrast to a steam engine, which burns its fuel outside the engine, uses the explosive combustion of a fuel inside a cylinder to push a piston; the piston's movement turns a crankshaft, which then turns the car's wheels via a chain or a drive shaft. The most common internal combustion engines are gasoline powered; others are fueled by diesel.
Gottlieb Daimler and Wilhelm Maybach built the first automobile based on an internal combustion engine in Germany in 1889. Powered by a 1.5 hp,
two-cylinder gasoline engine, it had a four-speed
transmission and traveled at 10 mph. Another
German, Karl Benz, also built a gasoline-powered
car the same year. The gasoline-powered
automobile, or motor car, remained largely a
curiosity for the rest of the nineteenth century, with
only a handful being manufactured in Europe and
the United States.
The first automobile to be produced in quantity was
the 1901 Curved Dash Oldsmobile, which was built
in the United States by Ransom E. Olds. Modern
automobile mass production, and its use of the
modern industrial assembly line, is credited to
Henry Ford of Detroit, Michigan, who had built his
first gasoline-powered car in 1896. Ford began
producing his Model T in 1908, and by 1927, when it was discontinued, more than 15 million had rolled off the assembly line.
Electric battery
A battery produces electricity using two different metals in a chemical solution. A chemical reaction between the metals and the chemicals frees more electrons in one metal than in the other. One terminal of the battery is attached to one of the metals, the other terminal to the other metal. The metal that frees more electrons develops a negative charge, and the other develops a positive charge. If a wire is connected from one end of the battery to the other, electrons flow through the wire to balance the charge.
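As a concrete, if simplified, illustration of how two different metals give a voltage, the sketch below uses the classic zinc and copper cell. This particular pair is not mentioned in the text above; it is only the standard textbook example, and the electrode potentials are the usual tabulated values.

    # Voltage of a simple two-metal cell from standard electrode potentials.
    # The zinc/copper pair is used purely as a textbook illustration.
    E_copper = +0.34   # volts, for Cu2+ + 2e- -> Cu
    E_zinc = -0.76     # volts, for Zn2+ + 2e- -> Zn

    # Zinc gives up electrons more readily, so it becomes the negative terminal;
    # the cell voltage is the difference between the two electrode potentials.
    cell_voltage = E_copper - E_zinc
    print(f"cell voltage is about {cell_voltage:.2f} V")   # roughly 1.10 V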
There is evidence that primitive batteries were used
in Iraq and Egypt as early as 200 B.C. for
electroplating and precious metal gilding. In 1748,
Benjamin Franklin coined the term battery to
describe an array of charged glass plates.
Around the 1790s, through numerous observations
and experiments, Luigi Galvani, an Italian professor,
caused muscular contraction in a frog by touching
its nerves with electrostatically charged metal.
Later, he was able to cause muscular contraction by
touching the nerve with different metals without a
source of electrostatic charge. He thought that
animal tissue contained an innate vital force, which
he termed "animal electricity."
In fact, it was Volta's famous disagreement with
Galvani's theory of animal electricity that led Volta,
in 1800, to build the voltaic pile to prove that
electricity did not come from the animal tissue but
was generated by the contact of different metals in
a moist environment.
Most historians attribute the invention of the
battery to Alessandro Volta since his voltaic pile was
the first battery that produced a reliable, steady
current of electricity.
Volta's invention was to give rise to electrochemistry, electromagnetism and the modern applications of electricity. Galvani's idea of animal electricity was not useless either: his research was soon to develop into electrophysiology and modern biology.
Loudspeaker
From time immemorial people have been
communicating through sounds because it is one of
the most efficient and economical means of
communication. However, sound produced by a
person has its limitations. It can only travel a
certain distance, in other words it is sometimes not
loud enough to reach the target. This need was the
mother of the invention of loudspeakers. A
loudspeaker is a type of transducer, i.e. it is a
device that can transform energy in one type of
wave, motion, signal, excitation or oscillation into
another. Loudspeakers convert electrical energy
into mechanical energy, which in turn is converted
into sound energy. Obviously a loudspeaker could
not have been invented before electricity was
discovered and means for producing it invented.
Alexander Graham Bell patented the first loudspeaker as part of his telephone in 1876. Ernst Siemens, a German, invented an improved version soon after, in 1878. The modern moving-coil loudspeaker design was established in 1898 by Oliver Lodge, a physicist and writer who was also involved in the development of the wireless telegraph. Lodge was, in addition, among the first to transmit a radio signal (in 1894, one year before Marconi did so), and he received international recognition for his work. Since large, powerful
permanent magnets of the correct shape for
loudspeaker construction were not freely available
at reasonable cost at that time, these loudspeakers,
found in early radio systems, utilized
electromagnets.
The quality of sound produced from loudspeaker
systems until the 1950s was rather poor.
Developments in cabinet technology (e.g. acoustic
suspension) and changes in materials used in the
actual loudspeaker, such as the move away from
simple paper cones, led to audible improvements.
Paper cones (or doped paper cones, where the
paper is treated with a substance to improve its
performance) are still in use today, and can provide
good performance. Polypropylene and aluminium
are also used as diaphragm materials.
Microphone
A microphone is a device that converts sound
waves into electricity. Microphones were first used
with early telephones and then radio transmitters.
In 1827, an English scientist, Sir Charles Wheatstone, coined the word "microphone."
In 1876, Emile Berliner invented the first
microphone used as a telephone voice transmitter.
He had seen a Bell Company telephone
demonstration at the U.S. Centennial Exposition and
was inspired to find ways to improve the newly
invented telephone. The Bell Telephone Company
was impressed with what the inventor came up with
and bought Berliner's microphone patent for
$50,000.
In 1878 David Edward Hughes invented the carbon microphone, which was further developed during the 1920s. Hughes's microphone was the early model
for the various carbon microphones now in use.
With the invention of the radio, new broadcasting
microphones were created. The ribbon microphone
was invented in 1942 for radio broadcasting.
In 1964, Bell Laboratories researchers James West
and Gerhard Sessler received a patent for an
electret microphone. The electret microphone offers
greater reliability, higher precision, lower cost, and
a smaller size. It revolutionized the microphone
industry, with almost one billion manufactured each
year.
During the 1970s, dynamic and condenser mics were further refined, allowing lower sound levels to be captured and clearer recordings to be made.
Microwave oven
The microwave oven is the first new method of
cooking since man invented fire. You may be
surprised to know that no one ever set out to
discover the microwave oven. It was an accidental
discovery.
Way back in 1940, two scientists, Sir John Randall
and Dr. H. A. Boot, invented a device called a
magnetron to produce microwaves in their lab at
England's Birmingham University. The magnetron generates the microwaves used in radar (radio detection and ranging), which bounces them off distant objects, such as enemy aircraft, to detect their presence.
In 1946, an American engineer named Dr. Percy
Spencer, a self-taught engineer was performing
tests on a magnetron tube when he got strong
cravings for the chocolate bar that was in his
pocket. He reached into his pocket only to be
surprised by a nice gooey mess. Doc Spencer was
well aware of the fact that the magnetron produced
heat, but he did not sense any. However, he
suspected that the magnetron had melted the
chocolate, not his body heat. He needed to test his
theory that the magnetron was cooking his food. He
sent out for a bag of popcorn and placed it in front
of the magnetron tube. The popcorn popped all
over the floor! The next morning he tried cooking eggs; one of his colleagues, very curious, got a bit too close, and the egg blew up in his face.
Raytheon, the company Spencer worked for, set out to make the first microwave oven. Since magnetrons were used to make radars, it named the oven the Radarange. The company soon succeeded in building the oven, but it was very large; after all, the 1940s were not known for miniaturization of electronics.
Now that we know the story of its invention, let us see how food is cooked in a microwave oven.
Microwaves are a type of radio waves. They can
pass through the outer layer of food (just as they
pass through the walls of a house) and heat the
interior directly. They do this by setting molecules of
water, fats, sugars, and other food components into
rapid motion. Since a molecule in the middle of a piece of food can receive this energy as readily as one on the exterior, microwaves are sometimes said to cook food from the inside out. In practice, however, they are generally absorbed in the outer inch or so of a piece of food, which is why thick items that are cooked in a microwave oven can still be raw inside.
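For a sense of scale, household microwave ovens operate at about 2.45 gigahertz; that figure is a standard one, assumed here rather than taken from the text. The corresponding wavelength follows from wavelength = speed of light / frequency.

    # Wavelength of oven microwaves: lambda = c / f.
    # 2.45 GHz is the usual household oven frequency (an assumption, not from the text).
    c = 299_792_458        # speed of light in m/s
    f = 2.45e9             # frequency in Hz

    wavelength_cm = c / f * 100
    print(f"wavelength is about {wavelength_cm:.1f} cm")   # roughly 12.2 cm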
Magnets
The most popular legend about the discovery of
magnets is that of an elderly Cretan shepherd
named Magnes. Legend has it that Magnes was
herding his sheep in an area of Northern Greece
called Magnesia, about 4,000 years ago. Suddenly
both, the nails in his shoes and the metal tip of his
staff became firmly stuck to the large, black rock on
which he was standing. To find the source of the attraction he dug up the earth and found lodestones (lode, meaning to lead or attract). Lodestones contain
magnetite, a natural magnetic material. This type
of rock was subsequently named magnetite, after
either Magnesia or Magnes himself. People soon
realized that magnetite not only attracted objects
made of iron, but when made into the shape of a
needle and floated on water, magnetite always
pointed in a north-south direction creating a
primitive compass. This led to an alternative name
for magnetite, that of lodestone or "leading stone".
For many years following the discovery of lodestone, magnetism was just a curious natural phenomenon. The Chinese are said to have developed the mariner's compass some 4,500 years ago; the earliest version comprised a splinter of lodestone carefully floated on the surface tension of water.
Peter Peregrinus is credited with the first attempt to
separate fact from superstition in 1269. Peregrinus
wrote a letter describing everything that was
known, at that time, about magnetite. However, significant progress in the understanding of magnetism came only with the experiments of William Gilbert in 1600. It was Gilbert who
first realized that the Earth was a giant magnet and
that magnets could be made by beating wrought
iron. He also discovered that heating resulted in the
loss of induced magnetism.
In 1820 Hans Christian Oersted, a scientist from Denmark, demonstrated that magnetism was related to electricity: bringing a wire carrying an electric current close to a magnetic compass caused a deflection of the compass needle. This led to the understanding that whenever current flows there is an associated magnetic field in the surrounding space, or, more generally, that the movement of any charged particle produces a magnetic field.
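Oersted's observation can be put into numbers with the standard formula for the field around a long straight wire, B = mu0 * I / (2 * pi * r). The current and distance in the sketch below are chosen only for illustration.

    import math

    # Magnetic field a distance r from a long straight wire: B = mu0 * I / (2 * pi * r).
    mu0 = 4 * math.pi * 1e-7   # permeability of free space, in T*m/A
    I = 1.0                    # current in amperes (arbitrary, for illustration)
    r = 0.05                   # distance from the wire in metres (5 cm)

    B = mu0 * I / (2 * math.pi * r)
    print(f"B is about {B:.1e} tesla")   # about 4e-6 T; the Earth's field is roughly 5e-5 T

Brought within a centimetre of the needle, a wire carrying a few amperes produces a field comparable to the Earth's own, which is why the deflection Oersted saw was easy to observe.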
Electric motor
A broad definition of "motor" would be: any device
that converts electrical energy into motion. As is so
often the case with inventions, the credit for
development of the electric motor belongs to more
than one individual. It came about through a process of development and discovery, beginning with Hans Oersted's discovery of electromagnetism in 1820 and involving additional work by William Sturgeon, Joseph Henry, Andre Marie Ampere, Michael Faraday, and a few others.
The story of the invention of the electric motor dates back to 1831, when the American physicist Joseph Henry published an article in a science journal describing
a device that was basically the reverse of the
electric generator. Instead of converting mechanical
movement into an electric current, like the
generator, his device used electric current to
produce mechanical movement. Henry's motor was
the first to be constructed, although inefficiency
limited its potential. In 1834 American blacksmith
Thomas Davenport improved the motor's operating
principles, using four magnets, two fixed and two
revolving. Davenport used his motor to operate his
own drills and wood-turning lathes. He went on to
incorporate his motor in the electric railway,
electric trolley, electric piano, and electric printing
press.
Meanwhile the English scientist and inventor Michael Faraday had been making advances of his own.
Faraday, having learned of Hans Christian Oersted's
discovery that an electric current created a
magnetic field, which could deflect a compass
needle, set out to reverse the results and create an
electric current from a magnetic field.
The motor built by Faraday consisted of a free-
hanging wire dipping into a pool of mercury. A
permanent magnet was placed in the middle of the
pool. When a current was passed through the wire,
the wire rotated around the magnet. This motor is
often demonstrated in school physics classes, but
brine is sometimes used in place of the toxic
mercury. This is the simplest form of a class of
electric motors called homopolar motors.
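The principle at work in Faraday's demonstration, and in every motor since, is that a current-carrying conductor in a magnetic field experiences a force; for a straight wire at right angles to the field the force is F = B * I * L. A minimal numerical sketch, with invented values:

    # Force on a straight wire at right angles to a magnetic field: F = B * I * L.
    # The values below are invented, purely for illustration.
    B = 0.5    # magnetic flux density in tesla
    I = 2.0    # current in amperes
    L = 0.1    # length of wire inside the field, in metres

    F = B * I * L
    print(f"force is about {F:.2f} N")   # 0.10 N; reversing the current reverses the force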
The modern DC motor was invented by accident in
1873, when Zénobe Gramme, a Belgian electrical
engineer, connected a spinning dynamo to a
second similar unit, driving it as a motor.
Airplane
Men have dreamed of being able to fly for centuries.
Leonardo da Vinci (1452-1519) imagined devices
that would enable human beings to fly and drew
pictures of such machines. In 1783 the Montgolfier brothers launched a hot air balloon that floated over Paris for 25 minutes. The development of powered
balloons, however, did not lead to practical aircraft.
Around the turn of the twentieth century, dozens of
people were working to invent the airplane. The
period of active experimentation began in 1891, when the noted German engineer Otto Lilienthal began experimenting with hang gliders. Lilienthal took
seriously the ideas advocated by Sir George Cayley
almost a hundred years earlier. Through an
extensive study of birds and bird flight, Cayley
realized that the lift function and the thrust function
of bird wings were separate and distinct, and could
be imitated. Following in Lilienthal's footsteps,
efforts to invent an airplane became commonplace
in Europe. Although an occasional aircraft flew
farther than 100 meters (about the length of a
football field), this level of performance was
exceptional. It was at such a juncture that the
legendary Wright brothers entered the arena.
The American brothers Wilbur and Orville Wright,
inspired by Lilienthal, decided in 1899 to master
gliding before attempting powered flight. First, for a
few months, the Wright brothers built and flew
several kites, testing and perfecting their new ideas
about a flight control system. In 1900, they used
this system on a man-carrying glider for the first
time. Before they risked their own necks, they flew
the glider as a kite, controlling it from the ground.
They flew three biplane (two wings, one above the other) gliders, and by 1902 they had developed a fully practical biplane glider. Their great innovation was that the glider could be balanced and controlled in every direction, by combining the actions of warping (twisting) the wings and turning the rudder for lateral control, and by using a device called an elevator for up-and-down movements, without any need for the pilot to swing his torso and legs to control the flight direction. All flight control today has developed from this 1902 Wright glider.
The development of the airplane is a twentieth-
century phenomenon. From the first powered
aircraft to the creation of the supersonic transport,
airplanes improved quickly. This was aided by the
innovations of World War I and World War II.
Demand for air travel led to the creation of an
industry including aircraft construction companies,
engine and equipment makers, as well as firms that
built and operated airports.
Atom
The first person to propose that matter was made of atoms,
and then write it down, was a Greek philosopher named
Democritus. The Greek concept of the atom was unlike ours:
to their minds a pickle was composed of small green sour
atoms, a fire of hot light bright atoms, etc.
A number of scientists, starting probably with Newton in the
late 1600s, proposed a corpuscular, or atomic, model. But it
wasn't until the late 1700s/early 1800s that a British scientist,
John Dalton, proposed that all matter was made of atoms and
actually used it to explain a bunch of experiments that had
been done on gases, and to calculate atomic weights of
elements. In addition to Dalton's work suggesting the atom
because of fixed chemical combining rules, there was the
astoundingly successful kinetic theory of gases, a subject of
intense interest in the nineteenth century, which relies utterly
on gases being made of little bits of flying matter.
However, Dalton did not prove that atoms existed...he just
showed that the concept of atoms was useful and helped
explain a lot of data. Probably the best direct probe of the atom was first made by Rutherford and his colleagues, who showed that when thin gold foil is bombarded by helium nuclei (alpha particles), the particles are occasionally deflected through a very large angle but usually pass straight through. (C. T. R. Wilson's cloud chamber later made the tracks of such particles directly visible.) This gave rise
to the realization that the gold was composed of atoms, with a
tiny nucleus at the middle, which could occasionally collide
with an alpha particle and send it flying.
Laser
A laser is a device that creates and amplifies a
narrow, intense beam of coherent light.
The word laser is an acronym for light amplification
by stimulated emission of radiation, although
common usage today is to use the word as a noun --
laser -- rather than as an acronym -- LASER.
Light is a kind of radiation emitted by atoms. Atoms
radiate light in random directions at random times.
The result is incoherent light -- a technical term for
what you would consider a jumble of photons going
in all directions.
The trick in generating coherent light -- of a single
or just a few frequencies going in one precise
direction -- is to find the right atoms with the right
internal storage mechanisms and create an
environment in which they can all cooperate -- to
give up their light at the right time and all in the
same direction.
The principle of the laser was first recognized in 1917, when the eminent scientist Albert Einstein described the theory of stimulated emission.
However, it was not until the late 1940s that
engineers began to utilize this principle for practical
purposes. At the onset of the 1950s, several engineers and physicists were working towards harnessing energy using the principle of stimulated emission. Notable among them were Charles Townes at Columbia University, Joseph Weber at the University of Maryland, and Alexander Prokhorov and Nikolai G Basov at the Lebedev Laboratories in Moscow. They were working towards the creation of what was termed a MASER (Microwave Amplification by the Stimulated Emission of Radiation), a device that amplified microwaves rather than light and soon found use in microwave communication systems.
Townes and the other researchers believed it would be possible to create an optical maser, a device for producing powerful beams of light by using higher frequency energy to stimulate what came to be called the lasing medium.
It was Theodore Maiman, however, who built the first working laser, in 1960, using a ruby crystal. Even so, Townes, Basov and Prokhorov were awarded the Nobel Prize in Physics in 1964.
The Laser was a remarkable technical breakthrough,
but in its early years it was something of a
technology without a purpose. It was not powerful
enough for use in the beam weapons envisioned by
the military, and its usefulness for transmitting
information through the atmosphere was severely
hampered by its inability to penetrate clouds and
rain. Almost immediately, though, some began to
find uses for it. Maiman and his colleagues
developed some of the first Laser weapons sighting
systems and other engineers developed powerful
lasers for use in surgery and other areas where a
moderately powerful, pinpoint source of heat was
needed. Today, for example, Lasers are used in
corrective eye surgery, providing a precise source of
heat for cutting tissue.
Vaccination
Vaccination is a term coined by Edward Jenner, an
English country doctor, for the process of
administering live, albeit weakened, microbes to
patients, with the intent of conferring immunity
against a targeted form of a related disease agent.
In common speech, 'vaccination' and 'immunization'
generally mean the same thing.
Edward Jenner had studied nature and his natural
surroundings since childhood. He had always been
fascinated by the rural old wives' tale that milkmaids could not get smallpox. He noticed that milkmaids often caught cowpox, a related but non-life-threatening disease, yet did not get smallpox itself. A milkmaid who caught cowpox got blisters on her hands, and Jenner concluded that it must be the pus in the blisters that somehow protected the milkmaids.
In 1796, Jenner decided to try out a theory he had
developed. A young boy called James Phipps would
be his guinea pig. He took some pus from cowpox
blisters found on the hand of a milkmaid called
Sarah. She had milked a cow called Blossom and
had developed the tell-tale blisters. Jenner ‘injected’
some of the pus into James. This process he
repeated over a number of days gradually
increasing the amount of pus he put into the boy.
He then deliberately injected Phipps with smallpox.
James became ill but after a few days made a full
recovery with no side effects. It seemed that Jenner
had made a brilliant discovery.
Jenner encountered the prejudices and
conservatism of the English society at that time.
People could not accept that a country doctor had
made such an important discovery and Jenner was
publicly humiliated when he publicized his findings.
However, eventually his discovery had to be
accepted – a discovery that was to change the
world. So successful was Jenner's discovery that in 1840 the government of the day banned any treatment for smallpox other than Jenner's. Jenner
did not patent his discovery as it would have made
the vaccination more expensive and out of the
reach of many. It was his gift to the world.
Clocks and watches
Clocks, whether on the wall, on our computers, or on our wrists in the form of watches, are the standard method for measuring time. The concept of time dates back to ancient times. Prehistoric man came up with very primitive methods
of measuring time by simple observation of the
stars, changes in the seasons, day and night. This
was necessary for planning nomadic activity,
farming, sacred feasts, etc.
The earliest time measurement devices before
clocks and watches were the sundial, hourglass and
water clock.
The forerunners to the sundial were poles and sticks
as well as larger objects such as pyramids and other
tall structures. Later the more formal sundial was
invented. It is generally a round disk marked with
the hours like a clock. It has an upright structure
that casts a shadow on the disk - this is how time is
measured with the sundial.
The hourglass was also used in ancient times. It was
made up of two-rounded glass bulbs connected by a
narrow neck of glass between them. When the
hourglass is turned upside down, a measured
amount of sand particles stream through from the
top to bottom bulb of glass. Today's egg timers are
modern versions of the hourglass.
Another ancient device to measure time was the
water clock or clepsydra. It was a container that
was evenly marked and had a spout in which water
dripped out. As the water dripped out of the
container one could note by the water level against
the markings what time it was.
One of the earliest clocks was invented by Pope
Sylvester II in the 990s. Later on chimes or bells
were added as well as dials to the clocks. Early
clocks were powered by falling weights and springs.
Clocks with pendulums came into existence later, in
1657.
Electric clocks came into being after 1850, but were not popular until the twentieth century. An electric motor driven by alternating current powers these clocks. Later, digital clocks with LCDs (liquid crystal displays) rivaled the electric clocks. Quartz clocks use the vibrations of a quartz crystal to regulate the clock.
Watches differ from clocks in that they are carried about or worn.
The first watches appeared by the 1500s and were
made by hand. They were very fancy and their
faces were covered by fine metal strips to protect
the markings. Watches were manufactured by
machine in the mid 1800s.
At first watches had knobs on the outside that the
wearer wound to keep the mainspring powered
inside. Later on, self-winding watches derived power
from the movement of the wearer. With the advent of quartz watches with digital displays, the need for mechanical winding has decreased.
Wheel
The wheel is probably the most important
mechanical invention of all time. Nearly every
machine built since the beginning of the Industrial
Revolution involves a single, basic principle
embodied in one of mankind’s truly significant
inventions. It’s hard to imagine any mechanized
system without the wheel or the idea of a
symmetrical component moving in a circular motion
on an axis. From tiny watch gears to automobiles,
jet engines and computer disk drives, the principle
is the same.
The earliest known use of this essential invention
was a potter’s wheel that was used at Ur in
Mesopotamia (part of modern-day Iraq) as early as
3500 BC. A Sumerian (ancient Iraq) pictograph,
dated about 3500 BC, shows a sledge equipped with
wheels. The idea of wheeled transportation may
have come from the use of logs for rollers, but the
oldest known wheels were wooden disks consisting
of three carved planks clamped together by
transverse struts.
The first use of the wheel for transportation was
probably for Mesopotamian chariots in 3200 BC. A
wheel with spokes first appeared on Egyptian
chariots around 2000 BC, and wheels seem to have
developed in Europe by 1400 BC without any
influence from the Middle East. Because the idea of
the wheel appears so simple, it’s easy to assume
that the wheel would have simply "happened" in
every culture when it reached a particular level of
sophistication. However, this is not the case. The
great Inca, Aztec and Maya civilizations reached an
extremely high level of development, yet they
never used the wheel. Even in Europe, the wheel
evolved little until the beginning of the nineteenth
century.
During the Industrial Revolution the wheel became
the central component of technology, and was used
in thousands of ways in countless different
mechanisms.
Spoked wheels appeared about 2000 BC, when they
were in use on chariots in Asia Minor. Later
developments included iron hubs (centerpieces)
turning on greased axles, and the introduction of a
tire in the form of an iron ring that was expanded by
heat and dropped over the rim and that on cooling
shrank and drew the members tightly together.
Photocopy machine
Photocopying is a process, which makes paper
copies of documents and other visual images
quickly and cheaply. James Watt (the man who invented the steam engine) devised a letter-copying machine in the late 1700s that can safely be called a forerunner of the photocopier, though Chester Carlson is regarded as the inventor of photocopying. Carlson was a part-time researcher and inventor whose office job required him to make a large number of copies of important papers. Carlson, who was arthritic, found this a painful and tedious
process. This prompted him to conduct experiments
in the area of photoconductivity, through which
multiple copies could be made with minimal effort.
Carlson experimented with "electrophotography" in
his kitchen and in 1938, applied for a patent for the
process. He made the first "photocopy" using a zinc
plate covered with sulfur. The word "10-22-38
Astoria" were written on a microscope slide, which
was placed on top of more sulfur and under a bright
light. After the slide was removed, a mirror image of
the words remained. Carlson tried to sell his
invention to some companies, but because the
process was still underdeveloped he failed. At the
time multiple copies were made using carbon paper
or duplicating machines, and people did not feel
any dire need for an electronic machine. Between 1939 and 1944, more than 20 companies, including IBM and GE, turned Carlson down, none of them believing there was a significant market for copiers.
In 1944, the Battelle Memorial Institute, a non-profit
organization in Columbus, Ohio, contracted with
Carlson to refine his new process. Over the next five
years, the institute conducted experiments to
improve the process of electrophotography. In 1947
Haloid (a small New York based organisation
manufacturing and selling photographic paper at
that time) approached Battelle to obtain a license to
develop and market a copying machine based on
this technology.
Haloid felt that the word "electrophotography" was
too complicated and did not have good recall value.
After consulting a professor of classical language at
Ohio State University, Haloid and Carlson changed
the name of the process to "Xerography", derived
from Greek words which meant "dry writing". Haloid
decided to call the new copier machines "Xerox"
and in 1948, the word Xerox was trademarked.
In the early 1950s, RCA (Radio Corporation of
America) introduced a variation on the process
called Electrofax where images are formed directly
on specially coated paper and rendered with a toner
dispersed in a liquid.
Aluminum
The common metal aluminum is not found naturally in its pure form, but rather as an ore (alumina) present in most soils and concentrated in a clay-like substance called bauxite. Aluminum makes up roughly 8% of the earth's crust and is the crust's most abundant metal. Bauxite is usually mined at or
near the surface of the earth, using power shovels
or draglines. The word bauxite (a name given to a
number of aluminum oxides) comes from Les Baux,
France where it was discovered in 1821. Bauxite is
mined and typically exported from Guinea,
Australia, and Brazil. Bauxite ranges in color from
yellowish-white to dark, plum-like red, and may
come in the forms of a fine powder or a clay.
Ever since ancient times, people have been using
alum (one of the aluminum compounds found in
nature). People used alum clays to make pottery,
utensils, dyes, and medicines. The earliest known
uses of alum date back to 5,300 BC.
It was not until the 1800’s that chemists were able
to successfully separate aluminum metal from the
other elements in the bauxite clay. In the 1800’s
aluminum was popularly called "the metal from
clay." When first commercially available, aluminum
was near priceless in value (in fact, Emperor
Napoleon III served his most honored guests with
aluminum forks and spoons, instead of the usual
gold or silver!). Making aluminum metal from
bauxite requires enormous amounts of energy
because an electrochemical ("loosening by
electricity") process is necessary to remove the
other elements that have bonded with the
aluminum. Recycling aluminum requires
approximately 95% less energy than producing the
metal from bauxite, because the base metal is
already identified and is then simply re-melted.
Glass
Glass is an inorganic solid material that is usually
clear or translucent with different colors. It is hard,
brittle, and stands up to the effects of wind, rain or
sun. Glass has been used for various kinds of
bottles and utensils, mirrors, windows and more.
Archaeological findings indicate that glass was first
made in the Middle East, sometime in the 3000's
B.C. It appears to have been produced as far back as the second millennium B.C. by the Egyptians and perhaps the Phoenicians. In the beginning, glass
manufacturing was slow and costly. Glass melting
furnaces were very small and hardly produced
enough heat to melt glass properly. In ancient
times, glass was a luxury item and few people could
afford it.
An unknown person discovered the blowpipe in the
1st century B.C. on the Phoenician coast. Glass
manufacturing flourished in the Roman empire and
spread from Italy to all countries under Roman
jurisdiction. Due to mass production, glass became an everyday object and was removed from the list of luxuries. The glassblowing innovation, along with
the backing of the powerful Roman Empire, made
glass products more accessible to the common
people. As the size of the Roman Empire increased,
the art of glass making spread to many other
countries.
During the 15th century in Venice, the first clear
glass called cristallo was invented and then heavily
exported. In 1675, glassmaker George Ravenscroft
invented lead crystal glass by adding lead oxide to
Venetian glass. On March 25, 1902, Irving W
Colburn patented the sheet glass drawing machine,
making the mass production of glass for windows
possible. A patent for a "glass-shaping machine"
was granted to Michael Owen on August 2, 1904.
The immense production of bottles, jars, etc. owes
its inception to this invention.
Portland Cement
Cement is an ultra-fine gray powder that binds sand
and rocks into a mass or matrix of concrete
necessary to build houses, bridges and skyscrapers.
Ever since civilizations first started to build, people have sought a material that would bind stones into a solid mass. The Assyrians and Babylonians
used clay for this purpose, and the Egyptians
advanced to the discovery of lime and gypsum
mortar as a binding agent for building such
structures as the Pyramids. The Greeks and the
Romans further developed other kinds of cement
that produced structures of remarkable durability.
The secret of Roman success in making cement can
be traced to the mixing of slaked lime with
pozzolana, a volcanic ash from Mount Vesuvius. This
process produced cement capable of hardening
under water. During the Middle Ages this art was
lost and it was not until the scientific spirit of inquiry
revived that we rediscovered the secret of hydraulic
cement -- cement that will harden under water.
Repeated structural failure of the Eddystone
Lighthouse off the coast of Cornwall, England, led
John Smeaton, a British engineer, to conduct
experiments with mortars in both fresh and salt
water. In 1756, these tests led to the discovery that
cement made from limestone containing a
considerable proportion of clay would harden under
water.
There were several other men who experimented in
the field of cement during the period from 1756 to
1830, for example, L. J. Vicat and Lesage in France
and Joseph Parker and James Frost in England.
Before portland cement was discovered and for
some years after its discovery, large quantities of
natural cement were used. Natural cement was
produced by burning a naturally occurring mixture
of lime and clay. Because the ingredients of natural
cement were mixed by nature, its properties varied
as widely as the natural resources from which it was
made.
The invention of Portland cement is generally
credited to Joseph Aspdin, an English mason. In
1824, he obtained a patent for his product, which
he named Portland cement because it produced a
concrete that was the colour of the excellent natural
stone quarried on the Isle of Portland, a limestone
peninsula in the English Channel west of the Isle of
Wight. The name has endured and is used
throughout the world, with many manufacturers
adding their own trade or brand names.
Bicycle
The bicycle is a two-wheeled, rider-balanced device for human transportation. Its invention has a long history, with the earliest confirmed example dating back to the early 19th century.
There are many claims for the invention of bicycle-
like machines but most of them are found to be
unreliable. A French man Comte de Sivrac is said to
have developed a two-wheeler, called a celerifere in
1791. The celerifere supposedly had two wheels set
on a ridged wooden frame and no steering,
directional control being limited to that attainable
by leaning. A rider was said to have sat astride the
machine and pushed it along using alternate feet.
We now know that the celerifere never existed; the story stems from a misinterpretation by the well-known French cycle historian Louis Baudry de Saunier in 1891.
The first reliable claim for a practically used bicycle
was by Karl von Drais. Drais invented his
Laufmaschine (running machine) of 1817 that was
called draisine by the press and later velocipede. In
contrast to the non-existent celerifere, von Drais's
machine was steerable. It is said that his interest in
finding an alternative to the horse was the
starvation and death caused by crop failure in 1816
("eighteen hundred and froze to death," following
the volcanic eruption of Tambora). On his first
reported ride from Mannheim on June 12, 1817, he
covered eight miles (13 km) in less than an hour.
The wooden draisine weighed 48 pounds (22 kg),
had brass bushings within the wheel bearings, a
rear-wheel brake and 6 inches (152 mm) trail of the
front-wheel for a self-centering castor effect. This
design sparked a short lived fashion among wealthy
dandies and several thousand copies were built and
used, primarily in Western Europe and in North
America. In Britain, where a D. Johnson introduced
the machine as the "pedestrian curricle," the
Corinthians of the Regency adopted it, although the
poet John Keats referred to it as "the nothing" of the
day. Riders wore out their boots surprisingly rapidly,
and the fashion ended within a couple of years.
Another early design is said to have come from Kirkpatrick MacMillan, a Scottish blacksmith, in 1839. He developed a rear-wheel-drive design using front-mounted treadles and connecting rods to a rear crank. He is associated with the first recorded instance of a cyclist committing a traffic offence: a newspaper reported in 1842 an accident in which he knocked somebody down and was fined five British shillings in Glasgow. However, the documentary evidence linking this treadle-drive machine to MacMillan is thought to be tenuous by some bicycle historians. Several machines of this type were made (one of which can be seen in the Science Museum, London); however, the design did not have a lasting influence.
The design of bicycles and other self-propelling
vehicles progressed gradually. Mechanics
experimented with pedal- or handle-driven three- or four-wheeled designs, but these suffered from greater weight and higher rolling resistance. However,
Willard Sawyer in Dover successfully manufactured
such vehicles and exported them worldwide in the
1850s.
Iron
People have utilized iron since time immemorial.
Evidence for the use of iron can be found as early as 4,000 B.C., but some historians believe it was in use prior to that time. Initially, iron was expensive and precious, and the technology of how to work it was a closely guarded military secret.
Originally, iron was obtained from meteorite
deposits, but as those supplies dwindled, the
method of reduction of iron ore in furnaces was
developed. By 1200 B.C., iron smelting techniques had made iron a viable material for weapons and tools. Because of the relatively low temperatures
attainable from furnaces prior to the Middle Ages,
repeated heating and hammering in a hot charcoal
fire was necessary to work out the carbon and other
impurities and forge the metal into desired shapes.
Around 1300, use of the blast furnace began to
spread throughout Europe. The furnace used a
steady stream of air to increase the intensity of the
heat. The air was produced by bellows or by water
pressure. As a result, iron production was
streamlined and new industries like blacksmithing,
wire drawing, and needle making emerged.
Cast iron was known but not widely used for a long
time. It was too brittle to be worked the same way
as wrought iron. But in 1709, Abraham Darby started using coke instead of charcoal as the fuel for his furnace. The result was a product
that was much stronger than previous cast iron.
Darby's method also made it possible to mass
produce iron cheaply and in standard shapes.
Eventually, his foundry at Coalbrookdale, England,
produced cast iron cylinders for train engines.
Englishman Henry Cort developed a grooved rolling
mill in 1783 for the production of iron bars.
Soon thereafter Cort developed a puddling process
for purifying iron from pig iron. In 1856 Sir Henry Bessemer invented a conversion process that blew air directly through molten pig iron for the efficient production of steel. This was
significant for the iron industry, since steel replaced
iron in importance within fifty years of Bessemer's
invention. The demand for iron as a building
material greatly increased as architects began to
adopt the use of iron building frames, partly as a
fire prevention measure. In 1848 American James
Bogardus built a sugar mill in New York entirely of
cast iron. One unique feature of the building was
that the beams and joists were interchangeable.
Similarly William Jenney, in 1885 in Chicago,
designed the first iron load-bearing frame
commercial building.
Throughout the nineteenth century iron was used extensively for the railways. It was used in building locomotives, bridges, and
railway tracks. It played a major role in the rapid
industrial development worldwide and in the
expansion of the British Empire in India. Iron was
also important for the shipping industry. Therefore
new techniques were developed to increase the
strength and durability of iron. The process of
galvanizing developed in the 1700s was later
applied to iron. In galvanization, a coating of zinc
bonded to the iron gave it between fifteen and
thirty years of protection against rust.
The production of iron is considered one of the
greatest industrial developments in history and
remains a guideline for evaluating technological
advancement in developing countries like India.
Toothpaste
Toothpaste is one of those inventions, like the wheel
and the calendar, that doesn't seem to have a
definite inventor or birthdate.
Almost 5000 years ago, people in ancient Egypt
were cleaning their teeth using a recipe of
powdered ashes, myrrh, powdered egg shells and
pumice. It is thought that they applied this paste to
their teeth with their fingers.
Although toothpaste was used as long ago as 500 B.C. in both China and India, modern toothpastes were developed in the 1800s. In 1824, a dentist
named Peabody was the first person to add soap to
toothpaste. John Harris first added chalk as an
ingredient to toothpaste in the 1850s. In 1873,
Colgate mass-produced the first toothpaste in a jar.
In 1892, Dr. Washington Sheffield of Connecticut was the first to put toothpaste into a collapsible tube. Sheffield's toothpaste was called Dr. Sheffield's
Creme Dentifrice. In 1896, Colgate Dental Cream
was packaged in collapsible tubes imitating
Sheffield. Advancements in synthetic detergents made after World War II allowed the soap used in toothpaste to be replaced with emulsifying agents such as sodium lauryl sulphate and sodium ricinoleate. A few years later, Colgate started to add fluoride to toothpaste.
Thermometer
Thermometer is an instrument for indicating
temperature and measuring its changes. Galileo Galilei, the famous Italian scientist, made the first such device, which consisted of a glass bulb containing air, connected to a glass tube of small bore dipping into a colored liquid. Though it was
very sensitive to variations of temperature, it was
not satisfactory as a measuring instrument,
because it was also affected by variations of
atmospheric pressure.
The invention of the type of thermometer familiar at
the present day, containing a liquid hermetically
sealed in a glass bulb with a fine tube attached, is
also generally attributed to Galileo at a slightly later
date, about 1612. Alcohol (~77%) was the liquid first employed. In order to render the readings of such instruments comparable with each other, it
was necessary to select a fixed point or standard
temperature as the zero or starting-point of the
graduations. Instead of making each degree a given
fraction of the volume of the bulb, which would be
difficult in practice, and would give different values
for different liquids, it was soon found to be
preferable to take two fixed points and to divide the
interval between them into the same number of
degrees. It was natural in the first instance to take
the temperature of the human body as one of the
fixed points. In 1701 Sir Isaac Newton proposed a
scale in which the freezing point of water was taken
as zero, and the temperature of the human body as 12. About the same date (1714) Gabriel Daniel
Fahrenheit proposed to take as zero the lowest
temperature obtainable with a freezing mixture of
ice and salt, and to divide the interval between this
temperature and that of the human body into 12. To
obtain finer graduations the number was
subsequently increased to 96. The freezing point of
water was at that time supposed to be somewhat
variable, because as a matter of fact it is possible to
cool water several degrees below its freezing-point
in the absence of ice. Fahrenheit showed, however,
that as soon as ice began to form the temperature
always rose to the same point, and that a mixture of
ice or snow with pure water always gave the same
temperature. At a later period he also showed that
the temperature of boiling water varied with the
barometric pressure, but that it was always the
same at the same pressure, and might therefore be
used as the second fixed point (as Edmund Halley
and others had suggested) provided that a definite
pressure, such as the average atmospheric
pressure, were specified.
Shortly after Fahrenheit’s death (1736) the freezing
and boiling-points of water were generally
recognized as the most convenient fixed points to
adopt, but different systems of subdivision were
employed. Fahrenheit's scale, with its small degrees and its zero below the freezing point, possesses undoubted advantages for meteorological work, and is still retained in most English-speaking countries.
But for general scientific purposes, the centigrade
system, in which the freezing point of water is
marked 0 and the boiling-point 100, is now almost
universally employed, on account of its greater
simplicity from an arithmetical point of view.
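Because the two scales share the same fixed points, with water freezing at 0 degrees centigrade (32 degrees Fahrenheit) and boiling at 100 degrees centigrade (212 degrees Fahrenheit) at normal pressure, converting between them is simple arithmetic. A small sketch:

    # Conversion between the centigrade (Celsius) and Fahrenheit scales,
    # anchored at the freezing (0 C = 32 F) and boiling (100 C = 212 F) points of water.
    def celsius_to_fahrenheit(c: float) -> float:
        return c * 9 / 5 + 32

    def fahrenheit_to_celsius(f: float) -> float:
        return (f - 32) * 5 / 9

    print(celsius_to_fahrenheit(100))    # 212.0
    print(fahrenheit_to_celsius(98.6))   # 37.0, roughly normal body temperature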
With the advent of semiconductors, solid state
thermometers were invented. Unlike a mercury or
alcohol based thermometer, there is no liquid in
such thermometers. They are based on the
variation of electrical conductivity with
temperature. One can read the temperature on a
digital display of a modern digital thermometer.
Soap
The need for a substance to help remove dirt, grease, foodstuffs, pitches, bodily excretions, and the like has always been part of the human experience and of human history. Before soap became an intentionally produced product, it was extracted from plants like yucca, soapwort, and horsetail.
The first known written mention of soap was on
Sumerian clay tablets dating about 2500 B.C. The
tablets spoke of the use of soap in washing wool.
Another Sumerian tablet, describes soap made from
water, alkali, and cassia oil. Historical evidence
shows that Egyptians bathed regularly and that
they combined animal and vegetable oils with
alkaline salts to create a soaplike substance for
washing.
Ancient Roman legend gives soap its name: From
Mount Sapo, where animals were sacrificed, rain
washed a mixture of melted animal fats and wood
ashes down into the Tiber River below. There, the
soapy mixture was discovered to be useful for
washing clothing and skin. It is believed that the
Romans acquired the knowledge of soap from the
Gauls.
With the fall of the Roman Empire, the popularity of
soap and bathing in Europe went into decline.
Though many non-European cultures maintained
bathing practices throughout the medieval period, it
wasn't until several centuries later that bathing
would come back into fashion in Europe.
Soapmakers' guilds began to spring up in Europe
during the seventh century. Secrets of the trade
were closely guarded. The training and promotion of
craftsmen within the trade was highly regulated.
Southern European countries, such as Italy, Spain,
and France were early production centers for soap
as they had an excellent supply of oil from olive
trees and barilla ashes, which they used to make
lye.
The English began manufacturing fancy varieties of
soap during the twelfth century, although soap
remained a heavily taxed luxury item. In Colonial
America, soap was made seasonally by women in
their homes; commercial production did not begin
until the early 1600s, when enterprising soapmakers
from England started arriving in America.
Scientific advancements that affected the
soapmaking trade began with Nicholas Leblanc, a
French chemist who invented a process that allowed
inexpensive production of soda ash. Michel
Chevreul in the early 1800s made significant
discoveries about the relationship of fats, glycerine,
and fatty acids and thus laid the groundwork for the
chemistry of soaps and fats. During the mid-1800s,
Belgian chemist Ernest Solvay discovered the
ammonia process that improved the methods for
extracting soda ash from common salt. This
increased the availability and quality of soda ash for
soapmaking.
As a result of these scientific achievements, soap
became a popular and easy-to-obtain commodity. It
also began to take on many different identities:
soap for bathing, soap for clothing, soap for
cleaning.
Cinema
Cinema can be described as the projection of
moving photographic pictures to an audience. The
invention of cinema cannot be credited to any one
person alone--various individuals played key roles in
the development of machines that could project
moving photographic pictures to an audience, but
Louis Le Prince and the Lumière brothers are most
often credited with the invention of cinema. Le
Prince built his single-lens camera in 1888, and the
Lumière brothers developed the Cinématographe in
1895.
The evolution of this technique was dependent on a
handful of technical principles. In 1832 Joseph
Antoine Ferdinand Plateau (1801-83) constructed a
device which created the illusion of movement
through the successive presentation of still images
showing phases of that movement. Photography,
the permanent record of optically-formed images on
light-sensitive material, was perfected
simultaneously by Louis Daguerre (1787-1851) and
William Henry Fox Talbot (1800-77) in 1839. Thus,
the technical principles for cinematography were
essentially understood by that date.
By the mid-1880s the key requirements for
cinematography were: the need for long strips of
pictures and a method of moving the strip
intermittently at a fast enough rate to record
movement smoothly--around sixteen pictures per
second. In 1885, George Eastman (1854-1932)
introduced a paper-based roll film. The Le Prince
single lens camera made use of Eastman's paper
roll film to record a sequence of images. It is
claimed that in October 1888, it was used to take 20
consecutive pictures of Leeds Bridge at a rate of
about 16 pictures per second. The camera and the
frames still exist, although no definite proof of their
date exists. But an English patent applied for by Le
Prince (1842-1890?), a French showman, engineer,
and inventor, in January 1888 describes the
principles of cinematography.
Thomas Alva Edison, the famous American
inventor, introduced moving pictures to the
general public when he opened the first Kinetoscope
parlour in New York in 1894. The Kinetoscope
was a coin-operated machine, which gave a 'show'
lasting about twenty seconds for a single viewer--
the original peepshow.
The brothers Louis (1864-1948) and Auguste (1862-
1954) Lumière were the most successful
photographic plate manufacturers in France. They
first saw a Kinetoscope in
the summer of 1894.
Impressed by the
demonstration but put off
by the high prices
demanded by Edison's
agents, they decided to
develop their own
product. In February 1895,
they patented a combined
camera and projector,
which used an intermittent claw derived from the
mechanism used in sewing machines to move the
cloth.
Tape recorder
A tape recorder records sound on a magnetic tape.
Its basic principle was worked out theoretically in
1888 by the American inventor Oberlin Smith (1840-
1926). It was, however, 10 years later that Smith's
ideas were adapted and the first working magnetic
recorder was introduced.
In 1898, a Danish inventor and physicist Valdemar
Poulsen invented a device that recorded and
reproduced sounds by residual magnetization of a
steel wire. The telegraphone, as it was called, was
demonstrated at the 1900 World Fair in Paris, but
the world took little notice of this new technology at
the time. The invention of magnetic recording tape
is variously attributed to J. A. O'Neill (b.1909), who
is said to have created a paper version in 1927 in
the United States, and the German engineer Fritz
Pfleumer, who in 1928 developed a tape made by
bonding a thin coating of oxide to strips of either
paper or film. It was Pfleumer who filed the first
audiotape patent in 1929. There is no doubt,
however, that audiotape was an improvement over
existing methods, such as records, for recording and
storing sound. The tape was easier to use, store,
and edit, and less expensive to produce. In 1935 the
German electronics firm AEG produced a prototype
record/playback machine, the Magnetophon, based
on Pfleumer's idea but using plastic tape. The
German company BASF further refined this
technique.
The first public tape recording was made by the
London Philharmonic Orchestra at BASF in 1936.
Other recording devices were also being developed
concurrently. In 1937 or 1938, Marvin Camras, a
U.S. inventor, built a magnetic wire recorder, using
a variation on the early work of Poulsen. His
recorder used a revolutionary magnetic recording
head to record around the wire symmetrically. Early
versions of his recorders were used during World
War II for training and strategic purposes, such as to
simulate battle sounds at non-invasion locations and
thereby mislead the enemy. Camras went on to
develop his recording techniques for home use. He
invented the first magnetic coatings that modern
recording tape is based on; these coatings are used
in videotape, computer tape, and floppy disks for
personal computers. He also discovered high-
frequency bias, used on almost all tape recorders
today to improve sound quality, and developed
multi-track tape recording, magnetic sound for
motion pictures, videotape recorders, a variety of
improved recording heads, and stereophonic sound
reproduction. Thin plastic tapes have become the
medium universally used in tape recorders. The
tapes have magnetic coatings consisting of
magnetically active particles, most commonly iron
oxide and chromium dioxide. Each of these
particles, in effect, is a tiny permanent magnet
embedded in the coating. As the tape passes across
the magnetic heads of the tape recorder, sound is
recorded, replayed, or erased
according to the heads that are activated. A
recording head magnetizes the passing tape in such
a way that the magnetic particles on it are
realigned. The resulting magnetization pattern
remains on the tape, which may be rewound and
replayed as often as desired, until erased or
changed. The audiocassette, introduced in 1963 by
the Philips Company of the Netherlands, was made
possible by Pfleumer's earlier development of
audiotape. Audiotape was used in a reel-to-reel
format, which was complicated and unwieldy, since
the user had to thread tape through the machine
and onto a take-up reel. Until the audiocassette
format, sound recording technology had remained
primarily a professional tool. Because of the ease
and economy of the audio-cassette, magnetic
recording
tape
recordings
could
compete
with long
playing
(LP)
records
(LPs). The
cassette
was

immediately popular because it made inserting,


advancing, and rewinding a tape fast and easy; it
could also be stopped and ejected at any point in
the tape.
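As a toy illustration of the recording principle described above, in which the tape's magnetization pattern stores the signal until it is replayed or erased, the following Python sketch models the tape as a simple array of magnetization values; it is a conceptual aid only, not a physical simulation.

import math

TAPE_LENGTH = 200
tape = [0.0] * TAPE_LENGTH            # blank tape: no net magnetization

def record(signal, start=0):
    # Record head: realign the "particles" so they follow the input signal.
    for i, sample in enumerate(signal):
        tape[start + i] = sample

def playback(start, length):
    # Playback head: read back the stored magnetization pattern.
    return tape[start:start + length]

def erase(start, length):
    # Erase head: remove the magnetization, leaving blank tape again.
    for i in range(start, start + length):
        tape[i] = 0.0

tone = [math.sin(2 * math.pi * n / 50) for n in range(50)]
record(tone)                          # write one cycle of a sine wave
assert playback(0, 50) == tone        # the pattern persists until changed
erase(0, 50)
assert playback(0, 50) == [0.0] * 50  # erased sections read back as silence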