Q: DVI and VGA ( Answered,   0 Comments )
Subject: DVI and VGA
Category: Computers > Hardware
Asked by: jamesbonduk-ga
List Price: $2.00
Posted: 08 May 2005 08:22 PDT
Expires: 07 Jun 2005 08:22 PDT
Question ID: 519164
What is the difference between VGA and DVI?

Request for Question Clarification by politicalguru-ga on 08 May 2005 11:14 PDT
Dear James, 

Are you asking because you're interested in buying a flat screen, or
because you would like an in-depth analysis of the technical features
of each?
Subject: Re: DVI and VGA
Answered By: tisme-ga on 08 May 2005 11:49 PDT
Hello jamesbonduk-ga,

I will be quoting excerpts from Wikipedia to explain the
difference. If you want a good understanding, I recommend that you
read the articles in their entirety:


The main difference is that VGA is an analog standard for computer
monitors that was first marketed in 1987, whereas DVI is a newer and
superior digital technology that has the potential to provide a much
better picture.

Here are two relevant excerpts from Wikipedia for your convenience:

"Existing standards, such as VGA, are analog and designed for CRT
based devices. As the source transmits each horizontal line of the
image, it varies its output voltage to represent the desired
brightness. In a CRT device, this is used to vary the intensity of the
scanning beam as it moves across the screen. However, in digital
displays, instead of a scanning beam there is an array of pixels and a
single brightness value must be chosen for each. The decoder does this
by sampling the voltage of the input signal at regular intervals. When
the source is also a digital device (such as a computer), this can
lead to distortion if the samples are not taken at the centre of each
pixel, and in general the crosstalk between adjacent pixels is high."

"DVI takes a different approach. The desired brightness of the pixels
is transmitted as a list of binary numbers. When the display is driven
at its native resolution, all it has to do is read each number and
apply that brightness to the appropriate pixel. In this way, each
pixel in the output buffer of the source device corresponds directly
to one pixel in the display device, whereas with an analog signal the
appearance of each pixel may be affected by its adjacent pixels as
well as by electrical noise and other forms of analog distortion."
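By contrast, the digital path can be sketched in a few lines. Note this is only a conceptual stand-in: real DVI uses TMDS encoding on the wire, not the plain binary strings shown here.

```python
# Sketch of the DVI idea: each pixel's brightness travels as a binary
# number, so the display applies it verbatim at native resolution --
# there is no sampling step and no blending between neighbours.

source = [0, 255, 0, 255, 0]

# "Transmit": encode each 8-bit brightness value as a binary string
# (a simplified stand-in for the real TMDS link encoding).
bitstream = [format(v, "08b") for v in source]

# "Receive": decode each number and map it one-to-one onto a pixel.
displayed = [int(bits, 2) for bits in bitstream]

print(displayed == source)  # True: a lossless, pixel-for-pixel copy
```

As long as every number arrives intact, the displayed row is bit-identical to the source row, which is exactly the "each pixel corresponds directly to one pixel" property the excerpt describes.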

Please let me know if you require further clarification or assistance
with this question.

All the best,


Search Strategy:

Used Wikipedia and did a search for "dvi" and "vga"
There are no comments at this time.
