You miss the crucial difference between an "analog computer" and a digital
computer: algorithms.
An analog device (and a DDA – digital
differential analyzer – which is its approximate equivalent using digital
technology) is a single-purpose circuit, built in a space domain, but operating
in a time domain. The circuit elements may be (in the analog case) op-amps,
summers, multipliers, approximators (output = f(input), usually
piece-wise linear approximations with diode-resistor networks, sometimes using
e^(k × input) junction-voltage
characteristics), integrators (output = ∫ input dt) and
differentiators (output = d(input)/dt). (A DDA provides digital
analogs to these.)
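The piece-wise linear approximator is easy to picture in software (a toy sketch, not any particular machine's circuit; the breakpoints and slopes below are made up): each diode-resistor segment adds to the slope once the input exceeds its breakpoint, just as a diode starts conducting past its knee voltage.

```python
# Sketch of a piece-wise linear approximator as a diode-resistor network
# realizes it: each (breakpoint, slope_change) pair models one diode that
# begins conducting above its breakpoint, adding to the overall slope.
# Values here are illustrative only.

def pwl_approx(x, base_slope, segments):
    """segments: list of (breakpoint, slope_change) pairs, sorted by breakpoint."""
    y = base_slope * x
    for breakpoint, slope_change in segments:
        if x > breakpoint:
            y += slope_change * (x - breakpoint)  # diode conducts past its knee
    return y

# Crude three-segment approximation of f(x) = x**2 on [0, 3].
segments = [(1.0, 2.0), (2.0, 2.0)]
print(pwl_approx(0.5, 1.0, segments))   # 0.5  (vs. exact 0.25)
print(pwl_approx(2.5, 1.0, segments))   # 6.5  (vs. exact 6.25)
```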
Most analog "computers" have changeable wiring, so
the elements can be reused for the next problem, but hardly any have analog
switches that would let an element serve different functions within a single
run. The free variable is always time (though time can be ignored
when you only care about the relationship among different functions of time),
and switching elements mid-run would play havoc with that free variable, and
thus with the function the analog device computes.
In brief, each analog (or DDA) element is
dedicated to one specific purpose at all times of interest.
In a
digital computer, the elements are time-shared — the adder, for instance,
(assuming a single ALU arguendo but without loss of generality) is used
for each summation in the whole run, by multiplexing the operands into and out
of it. All elements of the digital computer are similarly multiplexed.
The
multiplexing is what the algorithm is about, at this level of discourse: in
executing the algorithm, the multiplexing of the elements follows the math of
the algorithm.
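One way to picture that multiplexing (a toy sketch, not any real ALU): a single add routine is reused for every summation in the run, and the instruction list — the algorithm — decides which stored operands get routed through it and where the result lands. The register names and program here are invented for illustration.

```python
# Toy sketch of time-sharing a single adder: one alu_add is reused for
# every summation; the instruction list (the algorithm) multiplexes
# operands into it and results out of it.

def alu_add(a, b):
    return a + b  # the one adder in the machine

registers = {"r0": 3, "r1": 4, "r2": 10, "r3": 0, "r4": 0}
program = [              # (dest, src1, src2); every step reuses the same adder
    ("r3", "r0", "r1"),  # r3 = 3 + 4
    ("r4", "r3", "r2"),  # r4 = 7 + 10
]
for dest, s1, s2 in program:
    registers[dest] = alu_add(registers[s1], registers[s2])
print(registers["r4"])   # 17
```

The registers are exactly the storage elements mentioned below: they hold each signal stable while it is not an operand.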
This multiplexing is not feasible in the analog device (even
if the time issue could be resolved), since analog signals cannot simply be
held stable when the algorithm requires them to be stable (i.e., while a
particular signal is not an operand to a computing element). In the digital
computer, that stability requirement is met by some storage element (or
state element, if you prefer – depending on the discourse level you're at).
The DDA allows for storage, and
thus allows the time variable to be decoupled from real time, but is
still a dedicated device. True, a DDA can be simulated by an algorithm, but
that's a simulation, not a true DDA.
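Such a simulation looks roughly like this (a minimal sketch using simple Euler steps; a real DDA works with increments, but the idea is the same): the "time" variable is just a loop counter, fully decoupled from real time.

```python
# Minimal sketch of simulating an integrator element algorithmically:
# Euler steps over a loop counter stand in for the DDA's incremental
# integration. "Time" here is a program variable, not real time.
# Example problem: dy/dt = -y, y(0) = 1, whose exact solution is e**(-t).

import math

dt = 0.001
y = 1.0
for _ in range(1000):  # 1000 steps of dt = 0.001 integrates out to t = 1
    y += -y * dt       # integrator: accumulate dy = f(y) * dt
print(round(y, 3))     # close to 1/e ≈ 0.368
```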
From a slightly different
perspective: an analog gizmo uses real time as the free variable in the
computation it's analogizing: it's really the only free variable
available to it. The "computing" elements are distinct in space.
A digital
computer 'rotates' the computation into a realm where time is an artifact of the
algorithm (if it appears at all), and uses real-time to follow the algorithm and
make multiple uses of a small number of available computational
facilities.
Analog computers inherently operate in the time domain, where
algorithms are irrelevant; digital computers operate in a more space-like
domain, using algorithms.
---
--Bill. NAL: question the answers, especially mine.