
SRM UNIVERSITY
FACULTY OF ENGINEERING AND TECHNOLOGY
DEPT OF MECHANICAL ENGINEERING
SUBJECT CODE/TITLE: ME0002 ROBOTICS ENGINEERING AND APPLICATIONS
YEAR III, CYCLE TEST 2, PART A ANSWER KEY

1. A sensor is a transducer that is used to make a measurement of a physical variable of interest. A transducer is a device which converts one form of information into another form without changing the information content.

2. The objective is to identify the object that the image represents. This identification problem is accomplished using the extracted feature information.

3. It is a hardware device used to capture and store the digital image.

4. A mechanical or electromechanical system for control of the position or speed of an output transducer. Negative feedback is incorporated to minimize discrepancies between the output state and the input control setting.

PART B
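The negative-feedback principle in the answer to question 4 can be illustrated with a minimal proportional servo loop. This is only a sketch: the plant model (a pure integrator), the gain, the time step, and the step count are illustrative assumptions, not part of the answer key.

```python
# Minimal negative-feedback position servo (illustrative assumptions:
# integrator plant, proportional gain 2.0, 10 ms time step).

def simulate_servo(setpoint, position=0.0, gain=2.0, dt=0.01, steps=500):
    """Drive the output toward the input control setting.

    The actuator command is proportional to the discrepancy between the
    commanded setting and the measured output (negative feedback).
    """
    for _ in range(steps):
        error = setpoint - position   # discrepancy fed back
        velocity = gain * error       # actuator command
        position += velocity * dt     # integrate plant motion
    return position

final = simulate_servo(setpoint=1.0)
```

Because the error is fed back with the opposite sign, the discrepancy decays geometrically and the output settles at the setpoint.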

5. PROXIMITY SENSOR:


A sensor which senses the presence or absence of an object without physical contact with the object.

RANGE SENSORS:

Sensors used to measure the distance between the sensor and an object within their field of operation.

ACOUSTIC SENSORS:

Sensors which use sound waves (for example, ultrasonic waves) to detect the presence of an object or to measure the distance to it.

6. Adaptive control

Adaptive control is the control method used by a controller which must adapt to a controlled system with parameters which vary, or are initially uncertain. For example, as an aircraft flies, its mass will slowly decrease as a result of fuel consumption; a control law is needed that adapts itself to such changing conditions. Adaptive control is different from robust control in that it does not need a priori information about the bounds on these uncertain or time-varying parameters; robust control guarantees that if the changes are within given bounds the control law need not be changed, while adaptive control is concerned with control laws that change themselves.

Parameter estimation

The foundation of adaptive control is parameter estimation. Common methods of estimation include recursive least squares and gradient descent. Both of these methods provide update laws which are used to modify estimates in real time (i.e., as the system operates). Lyapunov stability is used to derive these update laws and show convergence criteria (typically persistent excitation). Projection and normalization are commonly used to improve the robustness of estimation algorithms.
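The gradient-descent update law described above can be sketched for the simplest possible case: estimating a single unknown plant gain in real time. The plant model y = a*u, the adaptation rate, and the input sequence are illustrative assumptions.

```python
# Gradient-descent parameter estimation, the foundation of adaptive control.
# Illustrative assumptions: static scalar plant y = a*u with one unknown gain.

true_a = 3.0    # unknown plant parameter (hidden from the estimator)
a_hat = 0.0     # initial parameter estimate
gamma = 0.1     # adaptation gain (step size)

# A persistently exciting input: varying and not identically zero,
# which is what the convergence criterion requires.
inputs = [1.0, -0.5, 2.0, 1.5, -1.0] * 40

for u in inputs:
    y = true_a * u                 # measurement from the plant
    error = y - a_hat * u          # prediction error
    a_hat += gamma * error * u     # gradient-descent update law
# a_hat converges toward true_a as the system operates.
```

Each update shrinks the parameter error by a factor (1 - gamma*u**2), which is why a varying, nonzero input is needed for convergence.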

Optimal control

Optimal control theory, an extension of the calculus of variations, is a mathematical optimization method for deriving control policies. The method is largely due to the work of Lev Pontryagin and his collaborators in the Soviet Union and Richard Bellman in the United States.

General method: Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved. A control problem includes a cost functional that is a function of state and control variables. An optimal control is a set of differential equations describing the paths of the control variables that minimize the cost functional. The optimal control can be derived using Pontryagin's maximum principle (a necessary condition, also known as Pontryagin's minimum principle or simply Pontryagin's Principle), or by solving the Hamilton-Jacobi-Bellman equation (a sufficient condition).

7. MACHINE VISION:

Machine vision (MV) is the technology and methods used to provide imaging-based automatic inspection and analysis for such applications as automatic inspection, process control, and robot guidance in industry.

Applications

The primary uses for machine vision are automatic inspection and industrial robot guidance. Common machine vision applications include quality assurance, sorting, material handling, robot guidance, and optical gauging.

Methods

Machine vision methods are defined as both the process of defining and creating an MV solution, and as the technical process that occurs during the operation of the solution. Here the latter is addressed. As of 2006, there was little standardization in the interfacing and configurations used in MV. This includes user interfaces, interfaces for the integration of multi-component systems, and automated data interchange.
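As a concrete instance of the optimal-control machinery described in question 6, the following sketch solves a scalar discrete-time linear-quadratic problem by the backward Riccati recursion, the dynamic-programming (discrete-time Hamilton-Jacobi-Bellman) route. The plant coefficients, cost weights, and horizon are illustrative assumptions.

```python
# Discrete-time optimal control via the backward Riccati recursion.
# Illustrative assumptions: scalar plant x[k+1] = a*x[k] + b*u[k],
# quadratic cost sum(q*x[k]**2 + r*u[k]**2).

a, b = 1.0, 1.0   # plant coefficients
q, r = 1.0, 1.0   # state and control cost weights
N = 50            # planning horizon

p = q             # terminal cost weight P_N
gains = []        # optimal feedback gains, computed backward in time
for _ in range(N):
    k = (b * p * a) / (r + b * p * b)   # minimizing gain at this stage
    p = q + a * p * a - a * p * b * k   # Riccati recursion for the cost-to-go
    gains.append(k)

# Apply the optimal policy u[k] = -gain[k] * x[k] forward from x0 = 5.
x = 5.0
for gain in reversed(gains):            # earliest stage uses last-computed gain
    u = -gain * x
    x = a * x + b * u
# x is driven close to zero while penalizing control effort.
```

For these weights the gain converges to the stationary value 1/phi (about 0.618, with phi the golden ratio), the fixed point of the Riccati recursion.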
Nonetheless, the first step in the MV sequence of operation is acquisition of an image, typically using cameras, lenses, and lighting that has been designed to provide the differentiation required by subsequent processing. MV software packages then employ various digital image processing techniques to extract the required information, and often make decisions (such as pass/fail) based on the extracted information.

Imaging

While conventional (2D visible light) imaging is most commonly used in MV, alternatives include imaging various infrared bands, line scan imaging, 3D imaging of surfaces, and X-ray imaging. Key divisions within MV 2D visible light imaging are monochromatic vs. color, resolution, and whether or not the imaging process is simultaneous over the entire image, making it suitable for moving processes. The most commonly used method for 3D imaging is scanning-based triangulation, which utilizes motion of the product or image during the imaging process. Other 3D methods used for machine vision are time of flight, grid based, and stereoscopic. The imaging device (e.g. camera) can either be separate from the main image processing unit or combined with it, in which case the combination is generally called a smart camera or smart sensor. When separated, the connection may be made to specialized intermediate hardware, a frame grabber, using either a standardized (Camera Link, CoaXPress) or custom interface. MV implementations also have used digital cameras capable of direct connections (without a frame grabber) to a computer via FireWire, USB or Gigabit Ethernet interfaces. Though the vast majority of machine vision applications are solved using 2-dimensional imaging, machine vision applications utilizing 3D imaging are a growing niche within the industry. One method is grid-array-based systems using a pseudorandom structured light system, as employed by the Microsoft Kinect system circa 2012.

Image processing

After an image is acquired, it is processed. Machine vision image processing methods include:

- Stitching/Registration: combining of adjacent 2D or 3D images.
- Filtering (e.g. morphological filtering).
- Thresholding: thresholding starts with setting or determining a gray value that will be useful for the following steps. The value is then used to separate portions of the image, and sometimes to transform each portion of the image to simply black and white based on whether it is below or above that grayscale value.
- Pixel counting: counts the number of light or dark pixels.
- Segmentation: partitioning a digital image into multiple segments to simplify and/or change the representation of the image into something that is more meaningful and easier to analyze.
- Edge detection: finding object edges.
- Color analysis: identify parts, products and items using color, assess quality from color, and isolate features using color.
- Blob discovery and manipulation: inspecting an image for discrete blobs of connected pixels (e.g. a black hole in a grey object) as image landmarks. These blobs frequently represent optical targets for machining, robotic capture, or manufacturing failure.
- Neural net processing: weighted and self-training multi-variable decision making.
- Pattern recognition, including template matching: finding, matching, and/or counting specific patterns. This may include location of an object that may be rotated, partially hidden by another object, or varying in size.
- Barcode, Data Matrix and "2D barcode" reading.
- Optical character recognition: automated reading of text such as serial numbers.

- Gauging/Metrology: measurement of object dimensions (e.g. in pixels, inches or millimeters).
- Comparison against target values to determine a "pass/fail" or "go/no go" result. For example, with code or bar code verification, the read value is compared to the stored target value. For gauging, a measurement is compared against the proper value and tolerances. For verification of alphanumeric codes, the OCR'd value is compared to the proper or target value. For inspection for blemishes, the measured size of the blemishes may be compared to the maximums allowed by quality standards.
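Three of the steps above (thresholding, pixel counting, and a gauging-style pass/fail comparison) can be sketched on a toy grayscale image. The pixel values, threshold, target area, and tolerance are illustrative assumptions.

```python
# Thresholding, pixel counting, and a pass/fail comparison on a toy image.
# All numeric values here are illustrative assumptions.

image = [
    [10,  10, 200, 200, 10],
    [10, 210, 220, 205, 10],
    [10,  10, 215,  10, 10],
]

threshold = 128  # gray value separating object from background

# Thresholding: transform each pixel to black (0) or white (1).
binary = [[1 if px > threshold else 0 for px in row] for row in image]

# Pixel counting: count the light pixels (the imaged part's area).
light_pixels = sum(sum(row) for row in binary)

# Gauging/comparison: compare the measured area against target + tolerance.
target_area, tolerance = 6, 2
passed = abs(light_pixels - target_area) <= tolerance
```

In a real system the pixel array would come from a camera and the blob of light pixels would typically be segmented and labeled before measurement, but the threshold-count-compare chain is the same.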

Outputs

A common output from machine vision systems is pass/fail decisions. These decisions may in turn trigger mechanisms that reject failed items or sound an alarm. Other common outputs include object position and orientation information from robot guidance systems. Additionally, output types include numerical measurement data, data read from codes and characters, displays of the process or results, stored images, alarms from automated space monitoring MV systems, and process control signals.
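The output types listed above can be illustrated with a hypothetical per-part inspection record. The field names, nominal dimension, and tolerance are assumptions for illustration, not taken from any particular MV system.

```python
# Hypothetical MV system output record: a pass/fail decision that drives a
# reject mechanism, alongside the numerical measurement data.

def inspect(measurement_mm, nominal_mm=25.0, tol_mm=0.5):
    """Return the per-part output a vision system might emit."""
    passed = abs(measurement_mm - nominal_mm) <= tol_mm
    return {
        "measurement_mm": measurement_mm,  # numerical measurement data
        "result": "pass" if passed else "fail",
        "reject_signal": not passed,       # would trigger the reject mechanism
    }

good = inspect(25.2)
bad = inspect(26.1)
```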

8. Write short notes on the following:
