Oscilloscopes come in two basic flavours, analogue and digital. The
term 'analogue' is a throwback to electronic circuits performing
mathematical functions, e.g. integration, summing, etc. The term is now
applied to all electronics where the signal is manipulated without
recourse to converting the signal into digital form. Digital
electronics manipulate the signal in bits (and bytes) where zeros and
ones represent the data as a binary number. Computers are (now)
basically digital devices because most of the data and processing is
digital from source to display. But the (now being superseded) 15-pin
'D' connector is an analogue output, as generally demanded by CRT type
monitors. So the signal has to be converted from a bunch of zeros and
ones to three voltages, one for each colour. This is performed by a
digital-to-analogue converter (more usually known as a D to A). You
can guess what an A to D is. However, even in a computer, the power
supply has to be analogue and most other signal sources are either
analogue somewhere in the path or require analogue circuitry to make
them function. So it is rare for a complex system to be absolutely
digital or absolutely analogue. This is true too for oscilloscopes.
Whilst oscilloscopes *can* be totally analogue, it makes much more
sense for 'housekeeping' duties such as switch functions and on-screen
display to be digitally controlled. As for the digital scope, unless
it is a logic analyser, where all the signals are either on or off
(e.g. zero volts and five volts), it will require an analogue front
end. The reason is that low voltage signals, of perhaps a few
millivolts or less, cannot be satisfactorily digitised by an A to D,
especially at high frequency.
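As a rough illustration of why, here is a toy sketch in Python (not any real scope's design): an idealised 8-bit A to D spanning 0 to 5 V has a smallest step of roughly 20 mV, so a few-millivolt signal barely moves the output code unless an analogue front end amplifies it first. The gain figure and full-scale voltage are assumptions for the example.

```python
def adc_8bit(voltage, full_scale=5.0):
    """Quantise a voltage into an 8-bit code (0-255), clipping at the rails."""
    code = round(voltage / full_scale * 255)
    return max(0, min(255, code))

step = 5.0 / 255                    # quantisation step: ~0.0196 V per code
print(round(step * 1000, 1))        # ~19.6 mV
print(adc_8bit(0.005))              # a raw 5 mV signal gives code 0 - invisible
print(adc_8bit(0.005 * 100))        # amplified x100 by the front end: code 26
print(adc_8bit(2.5))                # mid-scale gives code 128
```

In other words, the analogue amplifier scales the signal up into the converter's useful range before any digitising happens.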
Early digital scopes employed a CRT, so the signal finally had to be
converted back to analogue, but more recent ones employ LCDs, which
actually require digital signals because the pixels are addressed
digitally.
This naturally preserves accuracy because the display will always stay
exactly where it is told to, whereas a CRT can drift with temperature,
age and brightness.
Though the boundary can blur, the defining characteristic of a digital
scope is that after amplification (or attenuation) the signal is
processed in digital form, thereby allowing it to be readily stored,
measured, manipulated, magnified and displayed however you want. Hence
the display is not really a real-time representation of the signal,
because it goes through the internal computer. This has advantages for
very slow or very fast signals where in the first instance an analogue
scope would have a slow flashing trace and in the second instance
might not be able to display short, infrequent events. These short
events can be 'captured' by a digital scope. So what you usually see
on a digital scope is a steady picture which is easy on the eye, and
if stored will still be there when you come back from lunch.
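To make "stored, measured, manipulated, magnified" concrete, here is a toy sketch (assumed sample values, not real scope firmware): once the trace is just a list of numbers in memory, a measurement is simple arithmetic and magnification is just slicing out part of the record.

```python
import math

# Pretend these are 8-bit samples of one full-scale sine wave cycle,
# as captured by the A to D (100 samples per cycle, two cycles stored).
samples = [round(127.5 + 127.5 * math.sin(2 * math.pi * i / 100))
           for i in range(200)]

peak_to_peak = max(samples) - min(samples)  # a one-line 'measurement'
zoomed = samples[40:60]                     # 'magnify' by slicing the record
print(peak_to_peak)                         # 255: the full-scale swing
```

The stored record does not decay or drift; it sits there, exactly as captured, until you overwrite it.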
More experienced engineers still prefer analogue scopes for many
applications, though, because of the 'twiddle' factor on the knobs.
Sometimes a digital scope display can be misinterpreted, and effects
such as jitter are better perceived in real time. But it's all
a case of using the right tool. There are times when one might use an
analogue scope on a digital circuit when looking for 'funnies', and
vice versa a digital scope when you want to measure the voltage or
phase of an analogue signal. Fundamentally though, a digital scope is
more accurate, easier
to read (and misread), more compact, and inherently more amenable to
computer interfacing.
So if you are considering the purchase of a scope, you should really
try to get prior familiarisation with both types to assess their
suitability for what you want to do. There are some really good
bargains in second-hand analogue scopes, but digital scopes are newer
and trendier ('digital' being the buzzword), therefore you get less
for your buck.
BTW, UK spelling, so read 'analog' for 'analogue' etc. if you are foreign ;-)