
Visual Display Unit New 1


8/6/2019 Visual Display Unit New 1

http://slidepdf.com/reader/full/visual-display-unit-new-1 1/18

Visual display unit (Monitors)

A 19" LG flat-panel LCD monitor.

A visual display unit, often called simply a monitor or display, is a piece of electrical equipment that displays images generated from the video output of devices such as computers, without producing a permanent record. Most newer monitors use a TFT LCD, while older monitors were based around a cathode ray tube (CRT). The monitor comprises the display device, simple circuitry to generate and format a picture from video sent by the signal source, and usually an enclosure. Within the signal source, either as an integral section or a modular component, there is a display adapter that generates video in a format compatible with the monitor.

Problems

Dead pixels

A few LCD monitors are produced with "dead pixels". To keep monitors affordable, most manufacturers sell monitors with dead pixels. Almost all manufacturers have clauses in their warranties stating that monitors with fewer than some number of dead pixels are not considered defective and will not be replaced. A dead pixel usually has one or more of its red, green, and blue sub-pixels individually stuck always on or always off.

Like image persistence, this can sometimes be partially or fully reversed by software that rapidly cycles the affected sub-pixels through colors, although the chance of success is far lower than with a "stuck" pixel. A stuck pixel can also sometimes be repaired by gently flicking or massaging the spot on the screen, but applying too much force risks rupturing the fragile screen internals.


TTL monitors

IBM PC with green monochrome display.

An amber monochrome computer monitor, manufactured in 2007, which uses a 15-pin SVGA connector just like a standard color monitor.

Monitors used with the MDA, Hercules, CGA, and EGA graphics adapters in early IBM PCs and clones were controlled via TTL logic. Such monitors can usually be identified by the male DB-9 connector on the video cable. The disadvantage of TTL monitors was the limited number of colors available, due to the low number of digital bits used for video signaling.

Modern monochrome monitors use the same 15-pin SVGA connector as standard color monitors. They are capable of displaying 32-bit grayscale at 1024×768 resolution, making them able to interface with modern computers.

TTL monochrome monitors only made use of five of the nine pins. One pin was used as ground, and two pins were used for horizontal and vertical synchronization. The electron gun was controlled by two separate digital signals: a video bit and an intensity bit that controlled the brightness of the drawn pixels. Only four shades were possible: black, dim, medium, or bright.
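The two digital signals can be pictured as a two-bit input selecting one of four shades. A minimal sketch in Python, assuming one plausible mapping of bit combinations to shades (the text only says that the two bits yield four shades; the exact pairing here is an illustrative assumption):

```python
# Illustrative only: decode a TTL monochrome monitor's two digital
# inputs (video bit + intensity bit) into one of four shades.
# The specific bit-to-shade pairing is an assumption, not from the text.
def ttl_shade(video_bit: int, intensity_bit: int) -> str:
    """Map the video and intensity bits to a displayed shade."""
    shades = {
        (0, 0): "black",
        (0, 1): "dim",
        (1, 0): "medium",
        (1, 1): "bright",
    }
    return shades[(video_bit, intensity_bit)]

# Two bits give exactly four combinations, hence four shades:
for v in (0, 1):
    for i in (0, 1):
        print(v, i, ttl_shade(v, i))
```

With only these two bits per pixel, no finer gradation was possible, which is why TTL video was limited to so few colors and shades.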


Central processing unit


Die of an Intel 80486DX2 microprocessor (actual size: 12×6.75 mm) in its packaging.

A central processing unit (CPU) is an electronic circuit that can execute computer programs. This broad definition can easily be applied to many early computers that existed long before the term "CPU" ever came into widespread usage. The term itself and its initialism have been in use in the computer industry at least since the early 1960s (Weik 1961). The form, design, and implementation of CPUs have changed dramatically since the earliest examples, but their fundamental operation has remained much the same.

Early CPUs were custom-designed as a part of a larger, sometimes one-of-a-kind, computer. However, this costly method of designing custom CPUs for a particular application has largely given way to the development of mass-produced processors that are suited for one or many purposes. This standardization trend generally began in the era of discrete transistor mainframes and minicomputers and has rapidly accelerated with the popularization of the integrated circuit (IC). The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers. Both the miniaturization and standardization of CPUs have increased the presence of these digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in everything from automobiles to cell phones to children's toys.

History of CPUs

EDVAC, one of the first electronic stored-program computers.

Prior to the advent of machines that resemble today's CPUs, computers such as the ENIAC had to be physically rewired in order to perform different tasks. These machines are often referred to as "fixed-program computers," since they had to be physically reconfigured in order to run a different program. Since the term "CPU" is generally defined as a software (computer program) execution device, the earliest devices that could rightly be called CPUs came with the advent of the stored-program computer.

The idea of a stored-program computer was already present during ENIAC's design, but was initially omitted so the machine could be finished sooner. On June 30, 1945, before ENIAC was even completed, mathematician John von Neumann distributed the paper entitled "First Draft of a Report on the EDVAC." It outlined the design of a stored-program computer that would eventually be completed in August 1949 (von Neumann 1945). EDVAC was designed to perform a certain number of instructions (or operations) of various types. Significantly, the programs written for EDVAC were stored in high-speed computer memory rather than specified by the physical wiring of the computer.

Discrete transistor and IC CPUs

Transistor-based computers had several distinct advantages over their predecessors. Aside from facilitating increased reliability and lower power consumption, transistors also allowed CPUs to operate at much higher speeds because of the short switching time of a transistor in comparison to a tube or relay. Thanks to both the increased reliability and the dramatically increased speed of the switching elements (which were almost exclusively transistors by this time), CPU clock rates in the tens of megahertz were obtained during this period. These early experimental designs later gave rise to the era of specialized supercomputers like those made by Cray Inc.


Microprocessors

The integrated circuit from an Intel 8742, an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and I/O in the same chip.

Intel 80486DX2 microprocessor in a ceramic PGA package.

The introduction of the microprocessor in the 1970s significantly affected the design and implementation of CPUs. Since the introduction of the first microprocessor (the Intel 4004) in 1971 and the first widely used microprocessor (the Intel 8080) in 1974, this class of CPUs has almost completely overtaken all other central processing unit implementation methods. Combined with the advent and eventual vast success of the now ubiquitous personal computer, the term "CPU" is now applied almost exclusively to microprocessors.

Previous generations of CPUs were implemented as discrete components and numerous small integrated circuits (ICs) on one or more circuit boards. Microprocessors, on the other hand, are CPUs manufactured on a very small number of ICs, usually just one. The overall smaller CPU size that results from being implemented on a single die means faster switching time because of physical factors like decreased gate parasitic capacitance. This has allowed synchronous microprocessors to have clock rates ranging from tens of megahertz to several gigahertz. Additionally, as the ability to construct exceedingly small transistors on an IC has increased, the complexity and number of transistors in a single CPU have increased dramatically. This widely observed trend is described by Moore's law, which has proven to be a fairly accurate predictor of the growth of CPU (and other IC) complexity to date.
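The growth trend can be illustrated numerically. A rough sketch in Python, assuming the commonly quoted two-year doubling period (a figure not stated in the text) and using the Intel 4004's approximately 2,300 transistors as a starting point:

```python
# Rough illustration of exponential transistor-count growth under a
# Moore's-law-style doubling. The two-year doubling period and the
# 4004's ~2300-transistor count are assumptions for illustration.
def projected_transistors(start_count: int, years: int,
                          doubling_period: float = 2.0) -> int:
    """Project a transistor count forward under periodic doubling."""
    return int(start_count * 2 ** (years / doubling_period))

# ~2300 transistors, projected a decade forward (five doublings):
print(projected_transistors(2300, 10))  # 73600
```

Real processor generations do not track such a formula exactly, but the exponential shape is why CPU complexity grew so quickly once ICs became standard.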

MOS 6502 microprocessor in a dual in-line package, an extremely popular 8-bit design.

Related to number representation is the size and precision of numbers that a CPU can represent. In the case of a binary CPU, a bit refers to one significant place in the numbers a CPU deals with. The number of bits (or numeral places) a CPU uses to represent numbers is often called "word size", "bit width", "data path width", or "integer precision" when dealing with strictly integer numbers (as opposed to floating point). This number differs between architectures, and often within different parts of the very same CPU. For example, an 8-bit CPU deals with a range of numbers that can be represented by eight binary digits (each digit having two possible values), that is, 2^8 or 256 discrete numbers. In effect, integer size sets a hardware limit on the range of integers the software run by the CPU can utilize.[7]
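The relationship between word size and representable values is simply 2 raised to the number of bits, which a few lines of Python can confirm:

```python
# The number of distinct values a binary word can hold is 2**bits.
# This reproduces the 8-bit example from the text (2^8 = 256) and
# shows how quickly the range grows with wider words.
def representable_values(bits: int) -> int:
    """Count the distinct values an n-bit binary word can represent."""
    return 2 ** bits

print(representable_values(8))   # 256
print(representable_values(16))  # 65536
print(representable_values(32))  # 4294967296
```

This is why moving from 8-bit to 16- or 32-bit designs was such a dramatic jump: each added bit doubles the representable range.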

Integer range can also affect the number of locations in memory the CPU can address (locate). For example, if a binary CPU uses 32 bits to represent a memory address, and each memory address represents one octet (8 bits), the maximum quantity of memory that CPU can address is 2^32 octets, or 4 GiB. This is a very simple view of CPU address space, and many designs use more complex addressing methods like paging in order to locate more memory than their integer range would allow with a flat address space.
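The 4 GiB figure follows directly from the arithmetic: each address selects one octet, so a 32-bit address reaches 2^32 octets. A quick check in Python (1 GiB = 2^30 bytes):

```python
# Flat address space reachable with a 32-bit address, one octet
# (8 bits) per address, expressed in bytes and in GiB.
address_bits = 32
max_bytes = 2 ** address_bits

print(max_bytes)             # 4294967296 bytes
print(max_bytes // 2 ** 30)  # 4 GiB
```

Techniques like paging exist precisely because designers often need to reach more memory than this flat limit allows.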

Higher levels of integer range require more structures to deal with the additional digits, and therefore more complexity, size, power usage, and general expense. It is not at all uncommon, therefore, to see 4- or 8-bit microcontrollers used in modern applications, even though CPUs with much higher range (such as 16, 32, 64, even 128-bit) are available. The simpler microcontrollers are usually cheaper, use less power, and therefore dissipate less heat, all of which can be major design considerations for electronic devices. However, in higher-end applications, the benefits afforded by the extra range (most often the additional address space) are more significant and often affect design choices. To gain some of the advantages afforded by both lower and higher bit lengths, many CPUs are designed with different bit widths for different portions of the device. For example, the IBM System/370 used a CPU that was primarily 32-bit, but it used 128-bit precision inside its floating point units to facilitate greater accuracy and range in floating point numbers. Many later CPU designs use a similar mixed bit width, especially when the processor is meant for general-purpose usage where a reasonable balance of integer and floating point capability is required.


We, Lakshwanath & Puvanan, would like to express our heartfelt thanks to our beloved parents for their support and to our most dedicated teacher, Pn. Normanida.

THANK YOU SO MUCH!!!

 

INDEX

1. MONITORS
2. CENTRAL PROCESSING UNIT
3. MICROPROCESSOR
4. KEYBOARD
5. MOUSE


Keyboard (computing)

In computing, a keyboard is an input device, partially modeled after the typewriter keyboard, which uses an arrangement of buttons or keys that act as mechanical levers or electronic switches. A keyboard typically has characters engraved or printed on the keys, and each press of a key typically corresponds to a single written symbol. However, producing some symbols requires pressing and holding several keys simultaneously or in sequence. While most keyboard keys produce letters, numbers, or signs (characters), other keys or simultaneous key presses can produce actions or computer commands.

In normal usage, the keyboard is used to type text and numbers into a word processor, text editor, or other program. In a modern computer, the interpretation of keypresses is generally left to the software. A computer keyboard distinguishes each physical key from every other and reports all keypresses to the controlling software. Keyboards are also used for computer gaming, either with regular keyboards or with keyboards that have special gaming features, which can expedite frequently used keystroke combinations. A keyboard is also used to give commands to the operating system of a computer, such as Windows' Control-Alt-Delete combination, which brings up a task window or shuts down the machine.

Types

Standard

Standard keyboards, such as the 101-key US traditional keyboard or the 104-key Windows keyboard, include alphabetic characters, punctuation symbols, numbers, and a variety of function keys. The internationally common 102/105-key keyboards have a smaller left shift key and an additional key with some more symbols between that and the letter to its right (usually Z or Y).[1]

Gaming and multimedia

Keyboards with extra keys, such as multimedia keyboards, have special keys for accessing music, the web, and other oft-used programs, a mute button, volume buttons or a knob, and a standby (sleep) button. Gaming keyboards have extra function keys which can be programmed with keystroke macros. For example, Ctrl+Shift+Y could be a keystroke that is frequently used in a certain computer game. Shortcuts marked on color-coded keys are used for some software applications and for specialized uses including word processing, video editing, graphic design, and audio editing.

Thumb-sized

Smaller keyboards have been introduced for laptops, PDAs, cellphones, and users who have a limited workspace. The size of a standard keyboard is dictated by the practical consideration that the keys must be large enough to be easily pressed by fingers. To reduce the size of the keyboard, the numeric keypad to the right of the alphabetic keyboard can be removed, or the size of the keys can be reduced, which makes it harder to enter text. Another way to reduce the size of the keyboard is to reduce the number of keys and use chording, i.e. pressing several keys simultaneously. For example, the GKOS keyboard has been designed for small wireless devices. Other two-handed alternatives more akin to a game controller, such as the AlphaGrip, are also used as a way to input data and text. Yet another way to reduce the size of a keyboard is to use smaller buttons and pack them closer together. Such keyboards, often called "thumbboards" (for thumbing), are used in some personal digital assistants such as the Palm Treo and BlackBerry and in some Ultra-Mobile PCs such as the OQO.

Numeric

Numeric keyboards contain only numbers, mathematical symbols for addition, subtraction, multiplication, and division, a decimal point, and several function keys (e.g. End, Delete). They are often used to facilitate data entry with smaller keyboard-equipped laptops or with smaller keyboards that do not have a numeric keypad.

Non-standard or special-use types

Chorded


A keyset or chorded keyboard is a computer input device that allows the user to enter characters or commands formed by pressing several keys together, like playing a "chord" on a piano. The large number of combinations available from a small number of keys allows text or commands to be entered with one hand.
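The reason a handful of keys suffices is combinatorial: n keys pressed in combination give 2^n − 1 distinct non-empty chords. A short Python check:

```python
# Count the distinct non-empty chords available from n keys: each key
# is either held or not (2**n subsets), minus the empty combination.
def chord_count(keys: int) -> int:
    """Number of distinct non-empty key combinations (chords)."""
    return 2 ** keys - 1

print(chord_count(5))  # 31 chords -- enough for a 26-letter alphabet
```

Five keys, one per finger, already cover a full alphabet, which is what makes one-handed chorded entry practical.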


Virtual

Virtual keyboards, such as the I-Tech Virtual Laser Keyboard, project an image of a full-size keyboard onto a surface. Sensors in the projection unit identify which key is being "pressed" and relay the signals to a computer or personal digital assistant. There is also a virtual keyboard, the On-Screen Keyboard, for use on Windows.

Touchscreens

Touchscreens, such as those on the iPhone and the OLPC laptop, can be used as a keyboard. (The OLPC initiative's second computer will be effectively two tablet touchscreens hinged together like a book. It can be used as a convertible Tablet PC where the keyboard is one half-screen (one side of the book), which turns into a touchscreen virtual keyboard.)

Foldable

Foldable (also called flexible) keyboards are made of soft plastic that can be rolled or folded over for travel.[2] When in use, the keyboard can conform to uneven surfaces, and it is more resistant to liquids than a standard keyboard. Foldable keyboards can also be connected to portable devices and smartphones. Some models can be fully immersed in water, making them popular in hospitals and laboratories, as they can be disinfected.

Laser/Infrared

Some devices have recently been produced which project a keyboard layout onto any flat surface using a laser. These devices detect key presses via infrared, and can artificially produce the tapping or clicking sound of a physical keyboard through their software.

Alphabetic

The 104-key PC US English QWERTY keyboard layout evolved from the standard typewriter keyboard, with extra keys for computing.

The Dvorak Simplified Keyboard layout arranges keys so that frequently used keys are easiest to press, which reduces muscle fatigue when typing common English.

A Hebrew keyboard lets the user type in both Hebrew and the Latin alphabet.

Technology

Key switches

Keyboards on laptops, such as this Sony VAIO, usually have a shorter travel distance for the keystroke and a reduced set of keys.

"Dome-switch" keyboards (sometimes incorrectly referred to as membrane keyboards) are the most common type now in use. When a key is pressed, it pushes down on a rubber dome sitting beneath the key. A conductive contact on the underside of the dome touches (and hence connects) a pair of conductive lines on the circuit below. This bridges the gap between them and allows electric current to flow (the open circuit is closed). A scanning signal is emitted by the chip along the pairs of lines in the matrix circuit which connects to all the keys. When the signal in one pair becomes different, the chip generates a "make code" corresponding to the key connected to that pair of lines.
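The scanning process can be sketched in a few lines. The following Python sketch models a hypothetical 4×4 key matrix (real keyboards do this in the controller chip, and actual make codes follow a scan-code standard rather than this simple numbering): the controller drives one row line at a time and samples the column lines; a pressed key's closed dome connects its row to its column, and each hit yields a code.

```python
# Minimal software model of keyboard matrix scanning for an assumed
# 4x4 matrix. 'pressed' holds the (row, col) positions of closed
# switches; the returned codes stand in for real make codes.
def scan_matrix(pressed, rows=4, cols=4):
    make_codes = []
    for row in range(rows):            # energize one row line at a time
        for col in range(cols):        # sample every column line
            if (row, col) in pressed:  # dome pressed: circuit closed
                make_codes.append(row * cols + col)
    return make_codes

print(scan_matrix({(1, 2)}))  # key at row 1, column 2 -> [6]
```

Because every key shares just a grid of row and column lines, a large keyboard needs far fewer wires than one line per key.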

Connection types

There are several ways of connecting a keyboard using cables, including the standard AT connector commonly found on motherboards, which was eventually replaced by the PS/2 and then the USB connection. Prior to the iMac line of systems, Apple used the proprietary Apple Desktop Bus for its keyboard connector.


Mouse

While at the Stanford Research Institute, Engelbart endeavored to design a more efficient method for controlling computers, based on small movements of the hand corresponding to a point on a screen.

Mechanical mouse devices

Operating a mechanical mouse:
1: Moving the mouse turns the ball.
2: X and Y rollers grip the ball and transfer movement.
3: Optical encoding disks include light holes.
4: Infrared LEDs shine through the disks.
5: Sensors gather light pulses to convert to X and Y velocities.
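In software terms, the sensors deliver pulse counts per axis, which the driver converts into a displacement. An illustrative Python sketch, where the counts-per-millimetre resolution is a made-up parameter rather than a figure from the text:

```python
# Illustrative only: convert encoder pulse counts from a mechanical
# mouse's X or Y roller into millimetres of travel. The 4 counts/mm
# resolution is an assumed value for demonstration.
def pulses_to_mm(pulse_count: int, counts_per_mm: float = 4.0) -> float:
    """Translate a signed pulse count into signed travel in mm."""
    return pulse_count / counts_per_mm

dx = pulses_to_mm(40)   # 40 pulses on the X roller -> 10.0 mm
dy = pulses_to_mm(-8)   # negative counts mean the opposite direction
print(dx, dy)
```

Sampling these counts at a fixed interval is what turns pulses into the X and Y velocities mentioned in step 5.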

Apple Macintosh Plus mice, 1986

In 1986 Apple first implemented the Apple Desktop Bus, allowing the daisy-chaining of up to 16 devices, including arbitrarily many mice and other devices, on the same bus with no configuration whatsoever. Featuring only a single data pin, the bus used a purely polled approach to computer/mouse communications and survived as the standard on mainstream models (including a number of non-Apple workstations) until 1998, when the iMac began the industry-wide switch to USB. Beginning with the "Bronze Keyboard" PowerBook G3 in May 1999, Apple dropped the external ADB port in favor of USB, but retained an internal ADB connection in the PowerBook G4 for communication with its built-in keyboard and trackpad until early 2005.


Buttons

In contrast to the motion-sensing mechanism, the mouse's buttons have changed little over the years, varying mostly in shape, number, and placement. Engelbart's very first mouse had a single button; Xerox PARC soon designed a three-button model, but reduced the count to two for Xerox products. After experimenting with four-button prototypes, Apple reduced it back to one button with the Macintosh in 1984, while Unix workstations from Sun and others used three buttons. OEM bundled mice usually have between one and three buttons, although in the aftermarket many mice have always had five or more.

Apple Mighty Mouse with capacitance-triggered buttons.

One, two, three or more buttons? Examples include one-button, three-button, and five-button mice.


ICTL PORTFOLIO
TOPIC: INPUT DEVICES

DONE BY: LAKSHWANATH A/L JAGANATH & PUVANAN A/L THARMASEELAN

CLASS: 1 ARIF 1

TEACHER: PUAN NORMANIDA

STARTED ON: 27 MARCH 2009
ENDED ON: 29 MARCH 2009
