Computer monitor

From Wikipedia, the free encyclopedia



A liquid crystal display (LCD) computer monitor

A cathode-ray tube (CRT) computer monitor

A computer monitor is an output device that displays information in pictorial form. A monitor usually
comprises the display device, circuitry, casing, and power supply. The display device in modern
monitors is typically a thin film transistor liquid crystal display (TFT-LCD) with LED
backlighting, which has replaced cold-cathode fluorescent lamp (CCFL) backlighting. Older monitors
used a cathode ray tube (CRT). Monitors are connected to the computer via VGA, Digital Visual
Interface (DVI), HDMI, DisplayPort, Thunderbolt, low-voltage differential signaling (LVDS) or other
proprietary connectors and signals.
Originally, computer monitors were used for data processing while television sets were used for
entertainment. From the 1980s onwards, computers (and their monitors) have been used for both
data processing and entertainment, while televisions have implemented some computer
functionality. The common aspect ratio of televisions and computer monitors has changed from 4:3
to 16:10, and then to 16:9.
Modern computer monitors are easily interchangeable with conventional television sets. However,
because computer monitors do not necessarily include integrated speakers, it may not be possible to
use them in place of a television set without external components.[1]

Contents
 1 History
 2 Technologies
o 2.1 Cathode ray tube
o 2.2 Liquid crystal display
o 2.3 Organic light-emitting diode
 3 Measurements of performance
o 3.1 Size
o 3.2 Aspect ratio
o 3.3 Resolution
o 3.4 Gamut
 4 Additional features
o 4.1 Power saving
o 4.2 Integrated accessories
o 4.3 Glossy screen
o 4.4 Curved designs
o 4.5 Directional screen
o 4.6 3D
o 4.7 Touch screen
o 4.8 Tablet screens
o 4.9 Ultrawide screens
 5 Mounting
o 5.1 Desktop
o 5.2 VESA mount
o 5.3 Rack mount
o 5.4 Panel mount
o 5.5 Open frame
 6 Security vulnerabilities
 7 See also
 8 References
 9 External links

History
Early electronic computers were fitted with a panel of light bulbs where the state of each particular
bulb would indicate the on/off state of a particular register bit inside the computer. This allowed the
engineers operating the computer to monitor the internal state of the machine, so this panel of lights
came to be known as the 'monitor'. As early monitors were only capable of displaying a very limited
amount of information and were very transient, they were rarely considered for program output.
Instead, a line printer was the primary output device, while the monitor was limited to keeping track
of the program's operation.[citation needed]
As technology developed, engineers realized that the output of a CRT display was more flexible than
a panel of light bulbs and eventually, by giving the program itself control of what was displayed,
the monitor became a powerful output device in its own right.[citation needed]
Computer monitors were formerly known as visual display units (VDU), but this term had mostly
fallen out of use by the 1990s.

Technologies
Further information: Comparison of CRT, LCD, Plasma, and OLED and History of display technology
Multiple technologies have been used for computer monitors. Until the 21st century most used
cathode ray tubes, but they have since largely been superseded by LCD monitors.
Cathode ray tube
Main article: Cathode ray tube
The first computer monitors used cathode ray tubes (CRTs). Prior to the advent of home
computers in the late 1970s, it was common for a video display terminal (VDT) using a CRT to be
physically integrated with a keyboard and other components of the system in a single large chassis.
The display was monochrome and far less sharp and detailed than on a modern flat-panel monitor,
necessitating the use of relatively large text and severely limiting the amount of information that
could be displayed at one time. High-resolution CRT displays were developed for specialized
military, industrial and scientific applications, but they were far too costly for general use.
Some of the earliest home computers (such as the TRS-80 and Commodore PET) were limited to
monochrome CRT displays, but color display capability was already a standard feature of the
pioneering Apple II, introduced in 1977, and the specialty of the more graphically sophisticated Atari
800, introduced in 1979. Either computer could be connected to the antenna terminals of an ordinary
color TV set or used with a purpose-made CRT color monitor for optimum resolution and color
quality. Lagging several years behind, in 1981 IBM introduced the Color Graphics Adapter, which
could display four colors with a resolution of 320 x 200 pixels, or it could produce 640 x 200 pixels
with two colors. In 1984 IBM introduced the Enhanced Graphics Adapter which was capable of
producing 16 colors and had a resolution of 640 x 350.[2]
By the end of the 1980s color CRT monitors that could clearly display 1024 x 768 pixels were widely
available and increasingly affordable. During the following decade, maximum display resolutions
gradually increased and prices continued to fall. CRT technology remained dominant in the PC
monitor market into the new millennium, partly because it was cheaper to produce and offered
viewing angles close to 180 degrees.[3] CRTs still offer some image quality advantages[clarification needed] over
LCDs but improvements to the latter have made them much less obvious. The dynamic range of
early LCD panels was very poor, and although text and other motionless graphics were sharper than
on a CRT, an LCD characteristic known as pixel lag caused moving graphics to appear noticeably
smeared and blurry.
Liquid crystal display
Main articles: Liquid-crystal display and Thin-film-transistor liquid-crystal display
Several technologies have been used to implement liquid crystal displays (LCDs).
Throughout the 1990s, the primary use of LCD technology as computer monitors was in laptops
where the lower power consumption, lighter weight, and smaller physical size of LCDs justified the
higher price versus a CRT. Commonly, the same laptop would be offered with an assortment of
display options at increasing price points: (active or passive) monochrome, passive color, or active
matrix color (TFT). As volume and manufacturing capability improved, the monochrome and
passive color technologies were dropped from most product lines.
TFT-LCD is a variant of LCD which is now the dominant technology used for computer monitors.[4]
The first standalone LCDs appeared in the mid-1990s selling for high prices. As prices declined over
a period of years they became more popular, and by 1997 were competing with CRT monitors.
Among the first desktop LCD computer monitors were the Eizo L66 in the mid-1990s, the Apple
Studio Display in 1998, and the Apple Cinema Display in 1999. In 2003, TFT-LCDs outsold CRTs for
the first time, becoming the primary technology used for computer monitors.[3] The main advantages
of LCDs over CRT displays are that LCDs consume less power, take up much less space, and are
considerably lighter. The now common active matrix TFT-LCD technology also has less flickering
than CRTs, which reduces eye strain.[5] On the other hand, CRT monitors have superior contrast,
have a superior response time, are able to use multiple screen resolutions natively, and there is no
discernible flicker if the refresh rate[6] is set to a sufficiently high value. LCD monitors now have very
high temporal accuracy and can be used for vision research.[7]
High dynamic range (HDR)[6] has been implemented into high-end LCD monitors to improve color
accuracy. Since around the late 2000s, widescreen LCD monitors have become popular, in part due
to television series, motion pictures and video games transitioning to high-definition (HD), which
standard-width monitors cannot display correctly without stretching or cropping the content.
Standard-width monitors can also show HD content at its proper width, but they usually fill the
extra space at the top and bottom of the image with black bars. Other advantages of widescreen
monitors over standard-width monitors are that they make work more productive by displaying more of
a user's documents and images, and allow displaying toolbars alongside documents. They also have a
larger viewing area, with a typical widescreen monitor having a 16:9 aspect ratio, compared to the
4:3 aspect ratio of a typical standard-width monitor.
Organic light-emitting diode
Main article: Organic light-emitting diode
Organic light-emitting diode (OLED) monitors provide higher contrast and better viewing angles than
LCDs but they require more power when displaying documents with white or bright backgrounds and
have a severe problem known as burn-in.

Measurements of performance
The performance of a monitor is measured by the following parameters:

 Luminance is measured in candelas per square meter (cd/m2, also called a nit).
 Color depth is measured in bits per primary color or bits for all colors.
 Gamut is measured as coordinates in the CIE 1931 color space. The
names sRGB or AdobeRGB are shorthand notations.
 Aspect ratio is the ratio of the horizontal length to the vertical length. Monitors usually have the
aspect ratio 4:3, 5:4, 16:10 or 16:9.
 Viewable image size is usually measured diagonally, but the actual widths and heights are more
informative since they are not affected by the aspect ratio in the same way. For CRTs, the
viewable size is typically 1 in (25 mm) smaller than the tube itself.
 Display resolution is the number of distinct pixels in each dimension that can be displayed. For a
given display size, maximum resolution is limited by dot pitch.
 Dot pitch is the distance between sub-pixels of the same color in millimeters. In general, the
smaller the dot pitch, the sharper the picture will appear (a worked example follows this list).
 Refresh rate is the number of times in a second that a display is illuminated. Maximum refresh
rate is limited by response time.
 Response time is the time a pixel in a monitor takes to go from active (white) to inactive (black)
and back to active (white) again, measured in milliseconds. Lower numbers mean faster
transitions and therefore fewer visible image artifacts.
 Contrast ratio is the ratio of the luminosity of the brightest color (white) to that of the darkest
color (black) that the monitor is capable of producing.
 Power consumption is measured in watts.
 Delta-E: Color accuracy is measured in delta-E; the lower the delta-E, the more accurate the
color representation. A delta-E of below 1 is imperceptible to the human eye. Delta-Es of 2 to 4
are considered good and require a sensitive eye to spot the difference.
 Viewing angle is the maximum angle at which images on the monitor can be viewed, without
excessive degradation to the image. It is measured in degrees horizontally and vertically.
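Several of these parameters are related by simple arithmetic. As a rough sketch (in Python, using a
hypothetical 24-inch 1920x1080 panel purely as an example value), pixel density and an approximate
pixel pitch can be derived from the resolution and the diagonal size:

import math

def pixel_density(width_px, height_px, diagonal_in):
    # Pixels along the diagonal divided by the diagonal length in inches.
    diagonal_px = math.hypot(width_px, height_px)
    ppi = diagonal_px / diagonal_in
    pitch_mm = 25.4 / ppi  # approximate center-to-center pixel spacing in millimeters
    return ppi, pitch_mm

# Example: a 24-inch 1920x1080 monitor gives roughly 92 PPI and a pitch near 0.28 mm.
print(pixel_density(1920, 1080, 24))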
Size
Main article: Display size

The area, height and width of displays with identical diagonal measurements vary depending on aspect ratio.

On two-dimensional display devices such as computer monitors, the display size or viewable image
size is the actual amount of screen space that is available to display a picture, video or working
space, without obstruction from the case or other aspects of the unit's design. The main
measurements for display devices are: width, height, total area and the diagonal.
The size of a display is usually given by monitor manufacturers as the diagonal, i.e. the distance
between two opposite screen corners. This method of measurement is inherited from the method
used for the first generation of CRT television, when picture tubes with circular faces were in
common use. Being circular, it was the external diameter of the glass envelope that described their
size. Since these circular tubes were used to display rectangular images, the diagonal measurement
of the rectangular image was smaller than the diameter of the tube's face (due to the thickness of the
glass). This method continued even when cathode ray tubes were manufactured as rounded
rectangles; it had the advantage of being a single number specifying the size, and was not confusing
when the aspect ratio was universally 4:3.
With the introduction of flat panel technology, the diagonal measurement became the actual
diagonal of the visible display. This meant that an eighteen-inch LCD had a larger visible area than
an eighteen-inch cathode ray tube.
Estimating the monitor size by the distance between opposite corners does not take into
account the display aspect ratio, so that for example a 16:9 21-inch (53 cm) widescreen display has
less area than a 21-inch (53 cm) 4:3 screen. The 4:3 screen has dimensions of 16.8 in × 12.6 in
(43 cm × 32 cm) and area 211 sq in (1,360 cm2), while the widescreen is 18.3 in × 10.3 in (46 cm
× 26 cm), 188 sq in (1,210 cm2).
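These example figures follow directly from the diagonal and the aspect ratio. A minimal sketch of the
arithmetic (in Python, repeating only the 21-inch example values quoted above):

import math

def screen_dimensions(diagonal_in, ratio_w, ratio_h):
    # Diagonal of a unit rectangle with the given aspect ratio.
    unit_diagonal = math.hypot(ratio_w, ratio_h)
    width = diagonal_in * ratio_w / unit_diagonal
    height = diagonal_in * ratio_h / unit_diagonal
    return width, height, width * height

# 21-inch 4:3 screen: about 16.8 in x 12.6 in, roughly 211 sq in.
print(screen_dimensions(21, 4, 3))
# 21-inch 16:9 screen: about 18.3 in x 10.3 in, roughly 188 sq in.
print(screen_dimensions(21, 16, 9))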
Aspect ratio
Main article: Display aspect ratio
Until about 2003, most computer monitors had a 4:3 aspect ratio and some had 5:4. Between 2003
and 2006, monitors with 16:9 and mostly 16:10 (8:5) aspect ratios became commonly available, first
in laptops and later also in standalone monitors. One reason for this transition was that such
monitors had productive uses beyond widescreen computer game play and movie viewing, such as
displaying two standard letter pages side by side in a word processor, or showing a large CAD
drawing and the CAD application's menus at the same time.[8][9] In 2008, 16:10 became the most
commonly sold aspect ratio for LCD monitors, and in the same year 16:10 was the mainstream standard
for laptops and notebook computers.[10]
In 2010 the computer industry started to move from 16:10 to 16:9 because 16:9 had been chosen as
the standard high-definition television format, and because 16:9 panels were cheaper to
manufacture.
In 2011 non-widescreen displays with 4:3 aspect ratios were only being manufactured in small
quantities. According to Samsung this was because the "Demand for the old 'Square monitors' has
decreased rapidly over the last couple of years," and "I predict that by the end of 2011, production on
all 4:3 or similar panels will be halted due to a lack of demand."[11]
Resolution
Main article: Display resolution
The resolution of computer monitors has increased over time, from 320x200 during the early
1980s to 1024x768 during the late 1990s. Since 2009, the most commonly sold resolution for
computer monitors is 1920x1080.[12] Before 2013 top-end consumer LCD monitors were limited to
2560x1600 at 30 in (76 cm), excluding Apple products and CRT monitors. Apple introduced
2880x1800 with Retina MacBook Pro at 15.4 in (39 cm) on June 12, 2012, and introduced a
5120x2880 Retina iMac at 27 in (69 cm) on October 16, 2014. By 2015 most major display
manufacturers had released 3840x2160 resolution displays.
Gamut
Main article: Gamut
Every RGB monitor has its own color gamut, bounded in chromaticity by a color triangle. Some of
these triangles are smaller than the sRGB triangle, some are larger. Colors are typically encoded by
8 bits per primary color. The RGB value [255, 0, 0] represents red, but slightly different colors in
different color spaces such as AdobeRGB and sRGB. Displaying sRGB-encoded data on wide-
gamut devices can give an unrealistic result.[13] The gamut is a property of the monitor; the image
color space can be forwarded as Exif metadata in the picture. As long as the monitor gamut is wider
than the color space gamut, correct display is possible, if the monitor is calibrated. A picture that
uses colors that are outside the sRGB color space will display on an sRGB color space monitor with
limitations.[14] Even today, many monitors that can display the sRGB color space are not factory
adjusted to display it correctly. Color management is needed both in electronic publishing (via the
Internet for display in browsers) and in desktop publishing targeted to print.
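As an illustration of how monitor gamuts can be compared, the chromaticity triangle spanned by a
display's primaries can be measured numerically. The sketch below (Python, using the commonly
quoted CIE 1931 xy primaries of sRGB and Adobe RGB, rounded to two decimals and assumed here
for illustration) applies the shoelace formula to the area of each triangle:

def triangle_area(p1, p2, p3):
    # Shoelace formula for the area of a triangle given (x, y) vertices.
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 xy chromaticities of the red, green and blue primaries.
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
adobe_rgb = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]

srgb_area = triangle_area(*srgb)            # about 0.112
adobe_rgb_area = triangle_area(*adobe_rgb)  # about 0.151
print(adobe_rgb_area / srgb_area)           # the Adobe RGB triangle is roughly 35% larger

A larger triangle does not by itself guarantee accurate color; as noted above, the monitor must still
be calibrated to the color space of the content it displays.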
