Digital vs Analog (Analogue)
Originally posted: 02.Jul.2001 ● Last updated: 21.Jul.2005
Many say we've entered a new age: the Information Age.
Digital technology, the ability to represent data as a series of 0s and 1s, is what makes the Information Age possible.
The definitions below come from Webopedia and will give you a basic idea of the two contrasting concepts.
Almost everything in the world can be described or represented in one of two forms: analog or digital. The principal characteristic of analog representations is that they are continuous. In contrast, digital representations consist of values measured at discrete intervals.
Digital watches are called digital because they go from one value to the next without displaying all intermediate values. Consequently, they can display only a finite number of times of the day. In contrast, watches with hands are analog, because the hands move continuously around the clock face. As the minute hand goes around, it not only touches the numbers 1 through 12, but also the infinite number of points in between.
Early attempts at building computers used analog techniques, but accuracy and reliability were not good enough. Today, almost all computers are digital.
Describes any system based on discontinuous data or events. Computers are digital machines because at their most basic level they can distinguish between just two values, 0 and 1, or off and on. There is no simple way to represent all the values in between, such as 0.25. All data that a computer processes must be encoded digitally, as a series of zeroes and ones.
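The point that even a value like 0.25 must be encoded as a series of zeroes and ones can be made concrete. This little Python sketch (my illustration, using the standard `struct` module and the common 32-bit IEEE-754 float encoding) prints the actual bit pattern a computer stores for 0.25:

```python
import struct

# A computer cannot store "0.25" directly; it stores a bit pattern.
# Pack 0.25 as a 32-bit IEEE-754 float, then reinterpret those same
# four bytes as an unsigned integer so we can print the raw bits.
bits = struct.unpack(">I", struct.pack(">f", 0.25))[0]
print(format(bits, "032b"))  # → 00111110100000000000000000000000
```

Thirty-two zeroes and ones: that, and nothing else, is what "0.25" looks like inside the machine.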
The opposite of digital is analog. A typical analog device is a clock in which the hands move continuously around the face. Such a clock is capable of indicating every possible time of day. In contrast, a digital clock is capable of representing only a finite number of times (every tenth of a second, for example).
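That "finite number of times" is easy to put a number on. Here is the arithmetic (mine, not Webopedia's) for a digital clock that ticks in tenths of a second:

```python
# Distinct displayable times in one day for a clock that ticks
# every tenth of a second: hours * minutes * seconds * tenths.
times_per_day = 24 * 60 * 60 * 10
print(times_per_day)  # → 864000
```

Large, but finite; an analog clock face passes through infinitely many positions in the same day.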
In general, humans experience the world analogically. Vision, for example, is an analog experience because we perceive infinitely smooth gradations of shapes and colors. Most analog events, however, can be simulated digitally. Photographs in newspapers, for instance, consist of an array of dots that are either black or white. From afar, the viewer does not see the dots (the digital form), but only lines and shading, which appear to be continuous. Although digital representations are approximations of analog events, they are useful because they are relatively easy to store and manipulate electronically. The trick is in converting from analog to digital, and back again.
This is the principle behind compact discs (CDs). The music itself exists in an analog form, as waves in the air, but these sounds are then translated into a digital form that is encoded onto the disc. When you play a compact disc, the CD player reads the digital data, translates it back into its original analog form, and sends it to the amplifier and eventually the speakers.
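The CD pipeline just described (measure the wave at discrete intervals, encode each measurement as a number, decode the numbers back into amplitudes) can be sketched in a few lines of Python. The tiny sample rate and level count here are illustrative stand-ins of my own choosing, not real CD parameters (CD audio uses 44,100 samples per second and 65,536 levels):

```python
import math

SAMPLE_RATE = 8   # samples per second (CD audio: 44_100)
LEVELS = 16       # quantization levels  (CD audio: 65_536)

def sample(signal, rate, duration):
    """Measure a continuous signal at discrete intervals (digitize time)."""
    return [signal(n / rate) for n in range(int(rate * duration))]

def quantize(samples, levels):
    """Snap each sample in [-1, 1] to the nearest of `levels` steps (digitize amplitude)."""
    step = 2 / (levels - 1)
    return [round((s + 1) / step) for s in samples]

def reconstruct(codes, levels):
    """The player's 'translate back' step: integer codes -> approximate amplitudes."""
    step = 2 / (levels - 1)
    return [c * step - 1 for c in codes]

def wave(t):
    """The 'analog' signal: a pure 1 Hz sine tone."""
    return math.sin(2 * math.pi * t)

codes = quantize(sample(wave, SAMPLE_RATE, 1), LEVELS)   # what goes on the disc
approx = reconstruct(codes, LEVELS)                      # what the speakers get
```

The reconstructed amplitudes are only approximations of the original wave, but the error never exceeds half a quantization step, and raising the rate and level count shrinks it as far as you like.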
Internally, computers are digital because they consist of discrete units called bits that are either on or off. But by combining many bits in complex ways, computers simulate analog events. In one sense, this is what computer science is all about.
Suppose you're in a car, and the speedometer measures speed with a dial, from 0 to 120, with a pointer to indicate your speed. That is an analog gauge. Now suppose you trade in that car and get another that has red LED read-outs, indicating your speed in 1 MPH (or KPH) increments. That is a digital gauge.
Digital technology boils down to: it is or it isn't. And there is no inherent limit to how precise you can make a digital readout. For example, if you wanted your speed to read out in hundredths of a MPH (or KPH), you could do that. You could build a meter that would read out in millionths of a MPH, so precision is not the issue.
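A digital readout at any increment is just rounding to the nearest step. A minimal Python sketch (the function name and the sample speed are hypothetical, chosen for illustration):

```python
def digital_readout(speed, increment):
    """Display a continuous speed at a fixed increment, like an LED speedometer."""
    return round(speed / increment) * increment

# The car's "true" (analog) speed, shown at two precisions:
print(digital_readout(57.4832, 1))     # whole MPH → 57
print(digital_readout(57.4832, 0.01))  # hundredths of a MPH
```

Shrinking `increment` makes the readout as precise as you like; what it can never do is show the infinitely many values between two adjacent steps, which is exactly the analog/digital distinction.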
Here is a Google search for the query: digital vs analog