No, I'm not
Authored by: Anonymous on Saturday, January 05 2013 @ 09:51 AM EST
You miss the crucial difference between an "analog computer" and a digital computer: algorithms.
I just explained about "partitioning." I work in advanced development now, but I used to work in system design. We were responsible for saying "this goes in analog; this goes in digital; and this goes in software." It can be the same algorithm in any case, but some things work better in one domain than in another.
An analog device (and a DDA – digital differential analyzer – which is its approximate equivalent using digital technology) is a single-purpose circuit, built in a space domain, but operating in a time domain.
Be careful, or you will make the case that regular software is not patentable because it operates at a higher level of abstraction, but that interpreted software (e.g. emulating a DDA) might be patentable in some cases, merely because it emulates a lower level of abstraction.
Most analog "computers" have changeable wiring, so that the elements can be reused for the next device, but hardly any have analog switches that allow the elements to simulate different functions within a single run.
Analog FPGAs are available. I always see a lot of hand-waving about how the reconfigurability should affect patentability (btw, I believe it perhaps should, but not in the sense most people mean; perhaps I should write a paper). It's still math, whether you can choose different functions on the fly or not.
The free variable is always time, although the time can be ignored when you're interested in the relationship among different functions of time: element switching would cause serious problems in the free variable, thus in the function of the analog device.
Yes, "computing" per se with analog is a challenging problem, which has been overcome in a lot of (most?) cases by quantizing and using circuits in a fully saturated mode. If you're going to be that pedantic, you will have to, at some point, acknowledge that digital computers are analog at the bottom.
In brief, each analog (or DDA) element is dedicated to one specific purpose at all times of interest.
There are examples where this is not true, but I'll let it ride.
In a digital computer, the elements are time-shared — the adder, for instance, (assuming a single ALU arguendo but without loss of generality) is used for each summation in the whole run, by multiplexing the operands into and out of it. All elements of the digital computer are similarly multiplexed.
True.
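To make the time-sharing concrete, here is a minimal sketch of my own (the module name and widths are purely illustrative assumptions) of one adder serving two different summations by multiplexing its operands, written in synthesizable Verilog:

    module shared_adder (
        input  wire       clk,
        input  wire       sel,         // control: which summation gets the adder this cycle
        input  wire [7:0] a0, b0,      // operands for summation 0
        input  wire [7:0] a1, b1,      // operands for summation 1
        output reg  [7:0] sum0,
        output reg  [7:0] sum1
    );
        wire [7:0] a = sel ? a1 : a0;  // operand multiplexers
        wire [7:0] b = sel ? b1 : b0;
        wire [7:0] s = a + b;          // the single shared adder

        always @(posedge clk) begin    // demultiplex the result into its own register
            if (sel) sum1 <= s;
            else     sum0 <= s;
        end
    endmodule

The one "+" serves every addition; the select line routes operands through it in whatever order the algorithm dictates.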
The multiplexing is what the algorithm is about, at this level of discourse: in executing the algorithm, the multiplexing of the elements follows the math of the algorithm.
And here we get to the crux of the matter. Yes, the algorithm handles the multiplexing, but the algorithm also expresses that which is to be multiplexed. When I write something in Verilog, I can choose to write "synthesizable" Verilog, which can be directly translated into hardware where time (based on a clock tick) is the free variable, according to your reasoning. Or I can write a testbench, using Verilog as a purely software language.

But if I do write synthesizable Verilog, it can also be used directly as software, or (e.g. for your DDA) I can actually create hardware out of it. Are you really arguing that the exact same Verilog could express a patentable invention if I used it that way, but not if I used it as software?
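As a concrete, purely illustrative sketch of that point (the names and numbers are my own), assume a trivial multiply-accumulate module. The first module is synthesizable and can be turned into gates; the second is a testbench that uses Verilog strictly as software; both are the same language:

    // Synthesizable: a synthesis tool can turn this into hardware.
    module mac #(parameter W = 8) (
        input  wire           clk,
        input  wire           rst,
        input  wire [W-1:0]   a,
        input  wire [W-1:0]   b,
        output reg  [2*W-1:0] acc   // running sum of products
    );
        always @(posedge clk) begin
            if (rst) acc <= 0;
            else     acc <= acc + a * b;
        end
    endmodule

    // Not synthesizable: a testbench that exercises the same module purely in software.
    module mac_tb;
        reg         clk = 0, rst = 1;
        reg  [7:0]  a = 0, b = 0;
        wire [15:0] acc;

        mac dut (.clk(clk), .rst(rst), .a(a), .b(b), .acc(acc));

        always #5 clk = ~clk;                 // simulated clock, 10 time units per cycle

        initial begin
            @(posedge clk) rst <= 0;          // first edge clears acc, then release reset
            a <= 3; b <= 4; @(posedge clk);   // acc becomes 3*4 = 12
            a <= 2; b <= 5; @(posedge clk);   // acc becomes 12 + 2*5 = 22
            #1 $display("acc = %0d", acc);    // prints 22
            $finish;
        end
    endmodule

Hand mac to a synthesizer and you get a circuit; hand both modules to a simulator and you get a program. The source text is identical either way.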

This multiplexing is not feasible in the analog device (even if the time issue could be resolved), since the characteristics of the analog signals do not cater to stabilizing the signals when the algorithm would require them to be stable (i.e., when the particular signal is not an operand to a computing element).
Yes, it is a lot of effort to "reuse" analog components, as you state, but it is not impossible, and is actually done on a regular basis. But that's just an aside.
This stability requirement is met, in the digital computer, by some storage element (or state element, if you prefer – depending on the discourse level you're at).
And, as you know, there are storage elements available in analog, as well, such as the lowly capacitor (which is, in fact, the basis for dynamic RAM). BTW, did I mention that all digital is analog, and that, for example, even a static RAM is implemented with analog transistors operating in a saturated mode?
The DDA allows for storage, and thus allows the time variable to be decoupled from real time, but is still a dedicated device. True, a DDA can be simulated by an algorithm, but that's a simulation, not a true DDA.
But what happens when the software (what you call a simulation) actually runs fast enough to control stuff in the real world? Or what happens when the hardware engineer decides to reuse his expensive multiplier by adding different registers and making the DDA reconfigurable on the fly?
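For what it's worth, here is a hedged sketch of that second case (entirely my own illustration, not anything from a real design): a DDA-style integrator whose increment source is picked each cycle by a control register, so one expensive accumulator is reused for different functions on the fly. The math is the same whether I synthesize it or only ever simulate it:

    module reconf_integrator #(parameter W = 16) (
        input  wire                clk,
        input  wire                rst,
        input  wire [1:0]          src_sel,  // control register: which input to integrate this cycle
        input  wire signed [W-1:0] dx0,      // candidate increments
        input  wire signed [W-1:0] dx1,
        input  wire signed [W-1:0] dx2,
        output reg  signed [W-1:0] y         // running sum (the "integral")
    );
        reg signed [W-1:0] dx;

        always @(*) begin                    // data-path mux, steered by the control path
            case (src_sel)
                2'd0:    dx = dx0;
                2'd1:    dx = dx1;
                default: dx = dx2;
            endcase
        end

        always @(posedge clk) begin
            if (rst) y <= 0;
            else     y <= y + dx;            // rectangular-rule accumulation each clock tick
        end
    endmodule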
From a slightly different perspective: an analog gizmo uses real-time as the free variable in the computation it's analogizing: it's really the only free variable that's available to it. The "computing" elements are distinct in space.
OK, but there is nothing in patent law that states that time is the only allowed free variable.
A digital computer 'rotates' the computation into a realm where time is an artifact of the algorithm (if it appears at all), and uses real-time to follow the algorithm and make multiple uses of a small number of available computational facilities.
Sure.
Analog computers inherently operate in the time domain, where algorithms are irrelevant; digital computers operate in a more space-like domain, using algorithms.
But I can code a solution to a problem in Verilog, and then I can run it as hardware or software. Or I can code a problem as a schematic, and then build it or run a Spice simulation of it. I know several people say the simulation is not "real," but we actually package up Spice in a product that finds solutions for our customers more quickly than if they built the hardware. Yes, the commonly used meaning of "algorithm" includes the control path (allowing the multiplexing you seem to think changes everything), but it also includes the data path. To wave your hands and try to convince us that the algorithm is only the control path is unhelpful.
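To put the control-path/data-path point in the same terms (again just an illustrative sketch of my own, not anyone's real design): below, the one shared multiplier is the data path, and the little two-step sequencer that feeds it is the control path. Both are part of what the Verilog, i.e. the algorithm, expresses:

    module seq_mult (
        input  wire        clk,
        input  wire        rst,
        input  wire [7:0]  a, b, c, d,
        output reg  [15:0] p,              // p = a*b after step 0
        output reg  [15:0] q               // q = c*d after step 1
    );
        reg state;                         // control path: which step we are on

        wire [7:0]  x    = state ? c : a;  // data-path multiplexers
        wire [7:0]  y    = state ? d : b;
        wire [15:0] prod = x * y;          // the one shared multiplier

        always @(posedge clk) begin
            if (rst) begin
                state <= 1'b0;
            end else begin
                if (state) q <= prod;      // demultiplex the result
                else       p <= prod;
                state <= ~state;           // advance the sequence
            end
        end
    endmodule

Strip out the sequencing and you have not described the computation; strip out the multiplier and you have nothing to sequence.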

