Connecting the Pixels With VGA

The Video Graphics Array connector (better known as VGA) is a long-established standard for computer video output. Introduced with IBM's PS/2 desktop line in 1987 as a 15-pin interface, VGA went on to become ubiquitous on computers, projectors and even high-definition television sets. Many home entertainment systems and gaming consoles can also provide a VGA output, and with few exceptions most computers, as well as many TV sets, can be connected over VGA either directly or through a VGA adapter.


The main attraction of VGA is that it provides a simple, widely supported way to carry an image from a computer to a display, which is why for many years virtually every computer and most monitors offered a VGA connection. It grew into one of the most popular computer connections largely because of its easy setup: plug in the cable and the picture appears, with no drivers or negotiation required. On the computer side, the graphics card converts the digital frame buffer into analog red, green and blue signals plus separate sync signals and sends them over the 15-pin cable. A CRT monitor can use these analog signals directly, while a flat-panel display, or a VGA-to-digital converter such as a USB-connected capture adapter, samples them back into digital form before showing the image.

For an analog VGA connection, all that is needed is an analog VGA monitor, which can be purchased at relatively low cost; that low cost helped make VGA the standard graphics interface on the vast majority of computers. One advantage of a VGA connection is that it works directly with analog displays such as CRT monitors and older projectors. Another is that it adds essentially no latency and, at typical resolutions and refresh rates, displays graphics and animation at full quality. A drawback is that VGA carries video only, so audio requires a separate cable, and the analog signal can degrade over long or low-quality cables.
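As a rough illustration of the bandwidth involved, the original 640 x 480 at 60 Hz VGA mode needs a pixel clock of roughly 25 MHz once the horizontal and vertical blanking intervals are counted. Below is a minimal sketch of that arithmetic in Python, assuming the classic timing values of 800 total pixel periods per line and 525 total lines per frame:

    # Classic 640x480 @ 60 Hz VGA timing (assumed standard values):
    # each scanline is 800 pixel periods long (640 visible plus blanking),
    # and each frame is 525 lines tall (480 visible plus blanking).
    H_TOTAL = 800   # pixel periods per line, including front porch, sync, back porch
    V_TOTAL = 525   # lines per frame, including blanking
    REFRESH = 60    # nominal refresh rate in Hz

    pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH
    print(f"Approximate pixel clock: {pixel_clock_hz / 1e6:.2f} MHz")
    # Prints about 25.20 MHz; the standard clock is 25.175 MHz, which is
    # why the actual refresh rate of this mode is closer to 59.94 Hz.

Higher resolutions and refresh rates scale this figure up accordingly, which is one reason analog cable quality matters more at high resolutions.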

VGA connectors are used in conjunction with video cards and cables. For many years two different kinds of video card bus were common, namely PCI and AGP. PCI is a general-purpose expansion bus, inexpensive and long-lived, while AGP is a dedicated graphics slot found on desktop motherboards that costs more but provides better performance. Either way, a video card with a VGA output and a VGA cable are usually required to connect a VGA monitor to a PCI or AGP board.

DVI is a digital alternative to VGA that has become popular in recent years and is commonly found on HDTVs and other digital devices. DVI has its own signalling and comes in several variants: DVI-I connectors carry both digital and analog signals, so a passive adapter can provide a VGA output, while DVI-D carries a digital signal only. Many computers offer both VGA and DVI connections. As with any video cable, it is safest to power the computer and monitor off before plugging or unplugging the connectors.

Not all computers have a DVI port. Many machines are built with only an SVGA (Super VGA) connector, the familiar 15-pin analog port, while DVI is more often found on laptops and newer desktop video cards. DVI and SVGA outputs frequently appear side by side on computers intended to drive both high-resolution LCD and CRT monitors.