
Caroline D. Turla BS Psychology 2202 Writing Activity No.

December 5, 2011

Operant Conditioning Chamber (Skinner's Box)

Skinner determined that a behavior followed by a reinforcing stimulus results in an increased probability of that behavior occurring in the future. An operant conditioning chamber (also known as a Skinner box) is a laboratory apparatus used in the experimental analysis of behavior to study animal behavior. The chamber was created by B. F. Skinner while he was a graduate student at Harvard University (master's degree in 1930, doctorate in 1931), and it is used to study both operant conditioning and classical conditioning.

The structure forming the shell of a chamber is a box large enough to easily accommodate the organism being used as a subject; commonly used model organisms include rodents (usually lab rats), pigeons, and primates. The chamber is often sound-proof and light-proof to avoid distracting stimuli. Operant chambers have at least one operandum (or "manipulandum"), and often two or more, that can automatically detect the occurrence of a behavioral response or action. Typical operanda for primates and rats are response levers: if the subject presses the lever, the opposite end moves and closes a switch that is monitored by a computer or other programmed device. Typical operanda for pigeons and other birds are response keys, with a switch that closes if the bird pecks the key with sufficient force. The other minimal requirement of a conditioning chamber is a means of delivering a primary reinforcer or unconditioned stimulus, such as food (usually pellets) or water. The chamber can also register the delivery of a conditioned reinforcer, or "token".
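To make the control logic concrete, here is a minimal simulated sketch of the arrangement just described: one operandum (a lever whose switch closure the controller monitors) and one feeder, with every detected press followed by a pellet. The function names and the polling loop are illustrative assumptions, not a real chamber's hardware interface.

```python
import random

# Hypothetical sketch of the minimal chamber described above: one operandum
# (a lever whose switch closure is polled by the controller) and one feeder.
# read_lever_switch() and deliver_pellet() are illustrative stand-ins, not a
# real hardware API.

def read_lever_switch():
    """Simulate polling the lever's switch; True means the switch has closed."""
    return random.random() < 0.05          # the subject presses now and then

def deliver_pellet(log, t):
    """Simulate the feeder dispensing one food pellet (the primary reinforcer)."""
    log.append(("pellet", t))

def run_session(duration_s=60.0, poll_interval_s=0.1):
    """Continuous reinforcement: every detected press is followed by one pellet."""
    log, t = [], 0.0
    while t < duration_s:
        if read_lever_switch():
            log.append(("press", t))       # the response is detected and recorded
            deliver_pellet(log, t)         # ...and immediately reinforced
        t += poll_interval_s
    return log

if __name__ == "__main__":
    events = run_session()
    print(sum(1 for kind, _ in events if kind == "press"), "presses reinforced")
```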

Despite such a simple configuration, one operandum and one feeder, it is possible to investigate many psychological phenomena. Modern operant conditioning chambers typically have many operanda, such as several response levers, two or more feeders, and a variety of devices capable of generating many stimuli, including lights, sounds, music, figures, and drawings.

Levers. A Skinner box typically contains one or more levers which an animal can press, one or more stimulus lights, and one or more places in which reinforcers such as food can be delivered. The animal's presses on the levers can be detected and recorded, and a contingency between these presses, the state of the stimulus lights, and the delivery of reinforcement can be set up, all automatically (a code sketch of such a contingency follows below).

Food pellet. The food pellet serves as a reinforcer, or token, for the animal being studied. One of Skinner's experiments used rats. Initially there may be a few pellets in the hopper where reinforcers are delivered, plus a few scattered nearby, to allow the rat to discover that the hopper is a likely source of food.
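The contingency mentioned in the Levers paragraph above amounts to a rule the controller applies to every recorded press. The sketch below is one hedged way of expressing such a rule in code; the fixed-ratio schedule used as the example is a standard operant schedule, chosen here only for illustration rather than taken from this text.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Illustrative sketch: the "contingency, set up automatically" is just a rule
# the controller evaluates each time a response is detected and logged.

@dataclass
class ChamberState:
    light_on: bool = False                 # a rule may also consult the light's state
    presses_since_reinforcer: int = 0
    event_log: List[Tuple[float, str]] = field(default_factory=list)

def fixed_ratio(n):
    """Reinforce every n-th press (a fixed-ratio schedule)."""
    def rule(state: ChamberState) -> bool:
        return state.presses_since_reinforcer >= n
    return rule

def record_press(state: ChamberState, t: float, contingency) -> bool:
    """Log the press, apply the contingency, and report whether food is delivered."""
    state.event_log.append((t, "press"))
    state.presses_since_reinforcer += 1
    if contingency(state):
        state.event_log.append((t, "pellet"))
        state.presses_since_reinforcer = 0
        return True
    return False

# Example: under FR-5, only every fifth press is followed by a pellet.
state = ChamberState()
rule = fixed_ratio(5)
delivered = [record_press(state, t=float(i), contingency=rule) for i in range(12)]
print(delivered.count(True))  # -> 2
```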

Once the rat is happily eating from the hopper, he can be left in the Skinner box while the pellet dispenser is operated every now and then, so that he becomes accustomed to eating a pellet from the hopper each time the dispenser operates (the rat is probably learning to associate the sound of the dispenser with food, a piece of classical conditioning that is really incidental to the instrumental learning task at hand). Even once the animal has learned that the food pellets are reinforcing and where they are to be found, it would still probably take some time for him to learn that bar-pressing while the SD light is on produces food.

Electrified floor and SD light. The chamber may also include an electrified grid or floor through which a shock can be delivered to the animal as a punisher, as well as lights of different colors that give information about when food is available.

To learn an operant contingency by trial and error, the operant must be some behavior the animal performs often anyway. Instead of allowing the rat to learn by trial and error, one can use a "shaping" or "successive-approximations" procedure. Initially, instead of rewarding the rat for producing the exact behavior we require (lever pressing), he is rewarded whenever he performs a behavior that approximates lever pressing. The closeness of the approximation required for the rat to get a pellet is gradually increased, so that eventually he is reinforced only for pressing the lever. One might start by reinforcing the animal whenever he is in the front half of the Skinner box, then only when he is also on the side of the box where the lever is, then only when his head is pointing towards the lever, then only when he approaches the lever, touches the lever with the front half of his body, touches the lever with his paw, and so on, until the rat is pressing the lever in order to obtain the reinforcer.
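The shaping procedure can be summarized as a sequence of progressively stricter reinforcement criteria. The sketch below is a toy rendering of that idea; the behavior codes and the rule of advancing to the next criterion after three reinforced approximations are assumptions made purely for illustration.

```python
# Hedged sketch of shaping by successive approximations. Criteria are ordered
# from loose to strict; each is the set of behaviors that currently earns a pellet.
CRITERIA = [
    {"front_half", "lever_side", "faces_lever", "near_lever", "touches_lever", "presses_lever"},
    {"lever_side", "faces_lever", "near_lever", "touches_lever", "presses_lever"},
    {"faces_lever", "near_lever", "touches_lever", "presses_lever"},
    {"near_lever", "touches_lever", "presses_lever"},
    {"touches_lever", "presses_lever"},
    {"presses_lever"},                       # final criterion: only the target operant
]

def shape(observed_behaviors, reinforcers_per_step=3):
    """Yield (behavior, reinforced?) while tightening the criterion over time."""
    step, earned = 0, 0
    for behavior in observed_behaviors:
        reinforced = behavior in CRITERIA[step]
        if reinforced:
            earned += 1
            if earned >= reinforcers_per_step and step < len(CRITERIA) - 1:
                step, earned = step + 1, 0   # require a closer approximation next
        yield behavior, reinforced

# Example: pellets follow ever-closer approximations to lever pressing.
session = ["front_half"] * 6 + ["lever_side"] * 6 + ["near_lever"] * 6 + ["presses_lever"] * 6
for behavior, reinforced in shape(session):
    print(behavior, "pellet" if reinforced else "-")
```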

The rat may still not have completely learned the operant contingency; specifically, he may not yet have learned that the contingency between the operant response and reinforcement is signaled by the SD light. If we now leave him to work in the Skinner box on his own, he will soon learn this and will press the lever only when the SD light is on. Skinner's work did not focus on punishment; the punishment he did use was a mild "paw slap", which caused him to conclude, incorrectly, that punishment was ineffective. The Skinner box has also received criticism because it does not capture every nuance of the animal's behavior: pushing the lever with a nose or a paw registers as the same response, for example, and light touches of the lever may not be recorded.
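Read as a rule, the SD-light contingency above says that a lever press produces food only while the light is on. The sketch below encodes that rule and pairs it with a deliberately naive learning rule (press probabilities nudged up after reinforced presses and down after unreinforced ones), purely to illustrate how responding comes to be confined to light-on periods; the learning rule and all of its numbers are assumptions, not material from this text.

```python
import random

# Toy illustration of discrimination learning under the SD-light contingency.

def sd_contingency(light_on: bool, pressed: bool) -> bool:
    """Food is delivered only for presses made while the SD light is on."""
    return light_on and pressed

def simulate(trials=2000, seed=0):
    rng = random.Random(seed)
    p_press = {True: 0.5, False: 0.5}        # press probability when light is on / off
    for _ in range(trials):
        light_on = rng.random() < 0.5
        if rng.random() < p_press[light_on]:               # the simulated rat presses
            if sd_contingency(light_on, pressed=True):
                p_press[True] = min(1.0, p_press[True] + 0.01)   # reinforced press
            else:
                p_press[False] = max(0.0, p_press[False] - 0.01) # unreinforced press
    return p_press

# Pressing drifts toward light-on periods: roughly {True: ~1.0, False: ~0.0}.
print(simulate())
```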

Sources:
http://brembs.net/operant/skinnerbox.html
http://en.wikipedia.org/wiki/Operant_conditioning_chamber#Structure
http://en.wikipedia.org/wiki/File:Skinner_box_scheme_01.png
