
Internetworking (a combination of the words inter ("between") and networking; it is not internet-working or international-network) is the practice of connecting a computer network with other networks through the use of gateways that provide a common method of routing information packets between the networks. The resulting system of interconnected networks is called an internetwork, or simply an internet. The most notable example of internetworking is the Internet, a network of networks based on many underlying hardware technologies but unified by an internetworking protocol standard, the Internet Protocol Suite, often also referred to as TCP/IP. The simplest internetwork (an internetwork, not the Internet) consists of two LANs connected to each other via a router. Simply using a switch or a hub to connect two local area networks does not imply internetworking; it just expands the original LAN.
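As a minimal sketch of the forwarding decision behind that simplest internetwork, the following Python snippet (using the standard ipaddress module, with example private addresses that are not taken from the text) shows how a host on one LAN decides whether to deliver a packet directly on its own subnetwork or to hand it to the router that joins the two LANs.

```python
# Minimal sketch: deciding whether a destination is on the local LAN or must
# go through the router that joins two hypothetical LANs (addresses are examples).
from ipaddress import ip_address, ip_network

LOCAL_LAN = ip_network("192.168.1.0/24")   # this host's LAN
OTHER_LAN = ip_network("192.168.2.0/24")   # second LAN behind the router
ROUTER = ip_address("192.168.1.1")         # gateway interface on the local LAN

def next_hop(destination: str):
    """Return where a packet for `destination` should be sent first."""
    dst = ip_address(destination)
    if dst in LOCAL_LAN:
        return dst     # same subnetwork: deliver directly (no internetworking needed)
    return ROUTER      # different network: forward to the gateway/router

print(next_hop("192.168.1.42"))   # 192.168.1.42  (direct delivery)
print(next_hop("192.168.2.7"))    # 192.168.1.1   (via the router)
```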

Interconnection of networks
Internetworking started as a way to connect disparate types of networking technology, but it became widespread through the developing need to connect two or more local area networks via some sort of wide area network. The original term for an internetwork was catenet. The definition of an internetwork today includes the connection of other types of computer networks, such as personal area networks.

The network elements used to connect individual networks in the ARPANET, the predecessor of the Internet, were originally called gateways, but the term has been deprecated in this context because of possible confusion with functionally different devices. Today the interconnecting gateways are called Internet routers. Another type of interconnection of networks often occurs within enterprises at the Link Layer of the networking model, i.e. at the hardware-centric layer below the level of the TCP/IP logical interfaces. Such interconnection is accomplished with network bridges and network switches. This is sometimes incorrectly termed internetworking, but the resulting system is simply a larger, single subnetwork, and no internetworking protocol, such as the Internet Protocol, is required to traverse these devices. However, a single computer network may be converted into an internetwork by dividing the network into segments and logically dividing the segment traffic with routers.

The Internet Protocol is designed to provide an unreliable (not guaranteed) packet service across the network. The architecture avoids intermediate network elements maintaining any state of the network; instead, this function is assigned to the endpoints of each communication session. To transfer data reliably, applications must utilize an appropriate Transport Layer protocol, such as the Transmission Control Protocol (TCP), which provides a reliable stream. Some applications use a simpler, connectionless transport protocol, the User Datagram Protocol (UDP), for tasks that do not require reliable delivery of data or that require real-time service, such as video streaming or voice chat.
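The contrast between TCP's reliable stream and UDP's connectionless datagram service can be seen directly in code. The following is a minimal sketch using Python's standard socket module; the loopback address and port number are arbitrary placeholders, not values from the text.

```python
# Minimal sketch contrasting the two Transport Layer services described above,
# using Python's standard socket module (loopback address and port are arbitrary).
import socket

# UDP: connectionless, no delivery guarantee; each sendto() is an independent datagram.
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.sendto(b"hello", ("127.0.0.1", 9999))   # fire and forget
udp.close()

# TCP: connection-oriented, reliable byte stream; connect() must succeed before data flows.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    tcp.connect(("127.0.0.1", 9999))   # fails unless a server is listening there
    tcp.sendall(b"hello")              # TCP retransmits until acknowledged
except ConnectionRefusedError:
    print("no TCP listener on that port")
finally:
    tcp.close()
```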

Networking models

Two architectural models are commonly used to describe the protocols and methods used in internetworking. The Open Systems Interconnection (OSI) reference model was developed under the auspices of the International Organization for Standardization (ISO) and provides a rigorous description for layering protocol functions from the underlying hardware to the software interface concepts in user applications. Internetworking is implemented in the Network Layer (Layer 3) of the model.

The Internet Protocol Suite, also called the TCP/IP model of the Internet, was not designed to conform to the OSI model and does not refer to it in any of the normative specifications in Requests for Comment and Internet standards. Despite its similar appearance as a layered model, it uses a much less rigorous, loosely defined architecture that concerns itself only with the aspects of logical networking. It does not discuss hardware-specific low-level interfaces, and it assumes the availability of a Link Layer interface to the local network link to which the host is connected. Internetworking is facilitated by the protocols of its Internet Layer.
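Both models rest on the same layering idea: each layer wraps the data handed down from the layer above with its own header before passing it to the layer below. The following is a minimal sketch of that encapsulation, with purely illustrative header labels rather than real protocol header formats.

```python
# Minimal sketch of layered encapsulation: each layer wraps the payload handed
# down from the layer above with its own (purely illustrative) header.
def encapsulate(application_data: bytes) -> bytes:
    transport_segment = b"TCP-HDR|" + application_data    # Transport Layer (e.g. TCP)
    internet_packet = b"IP-HDR|" + transport_segment      # Internet Layer (IP): where internetworking happens
    link_frame = b"ETH-HDR|" + internet_packet            # Link Layer (e.g. Ethernet)
    return link_frame

print(encapsulate(b"GET / HTTP/1.1"))
# b'ETH-HDR|IP-HDR|TCP-HDR|GET / HTTP/1.1'
```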
Analog and digital signals

Analog and digital signals are used to transmit information, usually through electric signals. In both technologies, the information, such as audio or video, is transformed into electric signals. The difference between analog and digital technologies is that in analog technology, information is translated into electric pulses of varying amplitude, while in digital technology, information is translated into binary format (zero or one), where each bit is represented by one of two distinct amplitudes.

Analog signal
An analog or analogue signal is any continuous signal for which the time-varying feature (variable) of the signal is a representation of some other time-varying quantity, i.e., analogous to another time-varying signal. For example, in an analog audio signal, the instantaneous voltage of the signal varies continuously with the pressure of the sound waves. It differs from a digital signal, in which a continuous quantity is represented by a discrete function that can only take on one of a finite number of values. The term analog signal usually refers to electrical signals; however, mechanical, pneumatic, hydraulic, and other systems may also convey analog signals.

An analog signal uses some property of the medium to convey the signal's information. For example, an aneroid barometer uses rotary position as the signal to convey pressure information. In an electrical signal, the voltage, current, or frequency of the signal may be varied to represent the information. Any information may be conveyed by an analog signal; often such a signal is a measured response to changes in physical phenomena, such as sound, light, temperature, position, or pressure. The physical variable is converted to an analog signal by a transducer. For example, in sound recording, fluctuations in air pressure (that is to say, sound) strike the diaphragm of a microphone, which induces corresponding fluctuations in the current produced by a coil in an electromagnetic microphone, or in the voltage produced by a condenser microphone. The voltage or the current is said to be an "analog" of the sound.

An analog signal has a theoretically infinite resolution. In practice, an analog signal is subject to electronic noise and distortion introduced by communication channels and signal processing operations, which can progressively degrade the signal-to-noise ratio. In contrast, digital signals have a finite resolution. Converting an analog signal to digital form introduces a constant low-level noise called quantization noise into the signal, which determines the noise floor; but once in digital form, the signal can in general be processed or transmitted without introducing additional noise or distortion. Therefore, as analog signal processing systems become more complex, they may ultimately degrade signal resolution to such an extent that their performance is surpassed by digital systems. This explains the widespread use of digital signals in preference to analog in modern technology. In analog systems, it is difficult to detect when such degradation occurs; in digital systems, degradation can not only be detected but corrected as well.

The definition of an analog signal also reveals its shortcomings. Analog signals, because they rely on minute variations in voltage or current, are very susceptible to interference from outside sources. Even a small change in the analog signal from an outside source, such as electromagnetic interference, can cause dramatic changes in the signal and in how the signal is interpreted.
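As a rough illustration of quantization noise, the sketch below samples a sine wave, quantizes it to 3 bits (a bit depth chosen arbitrarily for illustration; real converters use many more levels), and computes the resulting signal-to-noise ratio set by the quantization error.

```python
# Minimal sketch: quantizing a sampled sine wave to 3 bits (8 levels) and
# measuring the quantization error that sets the digital noise floor.
import math

BITS = 3
LEVELS = 2 ** BITS   # 8 discrete amplitudes

samples = [math.sin(2 * math.pi * n / 64) for n in range(64)]   # "analog" values in [-1, 1]
quantized = [round((s + 1) / 2 * (LEVELS - 1)) / (LEVELS - 1) * 2 - 1 for s in samples]
errors = [s - q for s, q in zip(samples, quantized)]

signal_power = sum(s * s for s in samples) / len(samples)
noise_power = sum(e * e for e in errors) / len(errors)
print("SNR (dB):", 10 * math.log10(signal_power / noise_power))   # roughly 6 dB per bit
```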

Digital signal

A digital signal is a physical signal that is a representation of a sequence of discrete values (a quantified discrete-time signal), for example of an arbitrary bit stream, or of a digitized (sampled and analog-to-digital converted) analog signal. The term digital signal can refer to:

1. a continuous-time waveform signal used in any form of digital communication;
2. a pulse train signal that switches between a discrete number of signal levels, also known as a line coded signal, for example a signal found in digital electronics or in serial communications using digital baseband transmission, or a pulse code modulation (PCM) representation of a digitized analog signal.

A signal that is generated by means of a digital modulation method (digital passband transmission), produced by a modem, is in the first case considered a digital signal, and in the second case considered to have been converted to an analog signal. Digital is defined as "of, relating to, or using calculation by numerical methods or by discrete units." In a digital signal, numerical values are finite. For example, if we were counting from zero to one, only those two values could be represented; in an analog system, everything between zero and one can be represented as well.
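As a small illustration of the second sense above (a line-coded pulse train), the following sketch performs simple non-return-to-zero (NRZ) coding, holding one of two voltage levels for each bit period; the levels, bit pattern, and sample rate are illustrative only.

```python
# Minimal sketch of NRZ (non-return-to-zero) line coding: each bit of the
# stream is held at one of two voltage levels for one symbol period.
HIGH, LOW = 5.0, 0.0   # illustrative voltage levels

def nrz_encode(bits: str, samples_per_bit: int = 4) -> list[float]:
    """Turn a bit string like '1011' into a pulse train of voltage samples."""
    waveform = []
    for bit in bits:
        level = HIGH if bit == "1" else LOW
        waveform.extend([level] * samples_per_bit)   # hold the level for the bit period
    return waveform

print(nrz_encode("1011", samples_per_bit=2))
# [5.0, 5.0, 0.0, 0.0, 5.0, 5.0, 5.0, 5.0]
```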
