Authored by: bprice on Sunday, September 02 2012 @ 07:41 AM EDT
(S)He also referred to "the programmers who design
CPUs".
The tone of this statement — dismissive, at best
— hints that you dispute that I exist. I am a programmer; I started in
1961. Based on my experience and demonstrated aptitudes, I was assigned to the
design team for a major computer design/implementation – for some time at
the beginning, I was the only member of that team. Later, I was joined by other
programmers and some hardware design specialists. I designed the architecture
and major portions of the instruction set of the CPU, did about 20% of the logic
of the CPU, and wrote major portions of some of the compilers and of the
operating system. It was commercially quite successful, too.
I also designed
some minor hardware, and influenced the design of several other CPUs, before
returning to programming full time. Oh, yes – I've done some teaching and
some research, too, along with some national and international standards
work.
Yes, emmenjay, the set of "the programmers who design CPUs" is
not empty, despite your apparent disbelief. My history, however, is not
relevant to what follows.
I'm not sure that I have the energy to
educate a reader from zero to competence in Computer Science, nor do I expect
the reader to have the patience to read such an epic, if I wrote
it.
Writing such a magnum opus would not be necessary, nor
fruitful. Much of your audience here is already educated in Computer Science.
Some of us appear to have a better education in CompSci than you have
demonstrated thus far.
I understand well the drain on one's energy that
illness can bring about. The a priori probability that I'm alive is less
than 2.25%: in the past ten years, I've survived two episodes, each with an
a priori mortality of 85% (0.15 × 0.15 ≈ 2.25%). I'm still recovering from the
second, and my energy level is not up to snuff either. Thus, I address the
actual issues raised, rather than dancing around or threatening to write some
epic work (even if I felt myself competent to compose one).
A more productive
use of your time and energy would be to address the points raised by PolR,
several anonymice, and myself (did I leave anyone out? If I did, I apologize.).
PolR has done good formal-level expositions that a CompSci instructor should
be able to handle; I don't know anything about PolR's education and experience,
but his work has been quite impressive. My comments have been more global and
informal, addressing common misconceptions about mathematics and software,
misconceptions that can lead the undereducated to the false appearance of
software and mathematics as disjoint subjects. Some anonymice have addressed
topics at a lower, more hackerish hardware level. You have
addressed none of them, at any level.
Your refusal to address the issues
– the inclusion of programming within mathematics – has led to the
conclusion (on some contributors' parts) that you have recognized defeat, but
refuse to accept it. I observe signs that may indicate the presence of the
Dunning-Kruger effect. Perhaps I should apologize for being so blunt, but
Dunning-Kruger is what I see, up to the present.
---
--Bill. NAL: question the answers, especially mine.
Authored by: Anonymous on Sunday, September 02 2012 @ 08:00 PM EDT
I would dispute that my argument was "demolished". The poster to
whom I was replying made several statements indicating no familiarity with the
subject matter.
e.g. All computer storage ... is nothing more than
a mathematical function with its output being fed back into its
input.
Did you never learn about logic synthesis, Karnaugh mapping,
Quine-McCluskey, ...?
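The quoted claim about storage is, in fact, how a latch works at the gate level: a combinational Boolean function whose output is wired back into one of its own inputs. A minimal sketch of a gated D latch, modeled as such a feedback function (the function name and framing here are mine, for illustration only):

```python
# A gated D latch as a pure Boolean function whose previous output is
# fed back in as an input -- "a mathematical function with its output
# being fed back into its input", per the quoted claim.
def d_latch(d, enable, q_prev):
    """Next state: q = (enable AND d) OR (NOT enable AND q_prev)."""
    return (enable and d) or (not enable and q_prev)

q = False                                      # initially storing 0
q = d_latch(d=True, enable=True, q_prev=q)     # enabled: latch a 1
q = d_latch(d=False, enable=False, q_prev=q)   # disabled: feedback holds the 1
```

Minimizing exactly this kind of next-state expression is what Karnaugh maps and Quine-McCluskey are for.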
(S)He also referred to "the programmers who
design CPUs".
Have you ever designed a CPU? Unless it is hard-wired
(such as a Cray), its machine-level instructions are composed of addresses into
a micro-programmed ROM. The contents of that ROM dictate the flow of data in the
logic circuits such that for a given address location in the ROM (corresponding
to its machine "op code"), outputs provide controls to latches,
increment/decrement modes, device triggers, and in some cases data to be
latched. Programming this microcode is little different from programming at the
machine-operations level.
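The scheme described above can be sketched as a toy control unit: each op code indexes a ROM word whose fields drive latches and the ALU. The op codes, field names, and one-word-per-instruction simplification here are all invented for illustration; a real microengine sequences several microwords per instruction.

```python
# A toy microcoded control unit. Each op code selects a control word
# from the "ROM"; the control word's fields gate the datapath.
LOAD, ADD, STORE = 0, 1, 2  # hypothetical op codes

MICROCODE_ROM = {
    LOAD:  {"latch_acc": True,  "alu_add": False, "write_mem": False},
    ADD:   {"latch_acc": True,  "alu_add": True,  "write_mem": False},
    STORE: {"latch_acc": False, "alu_add": False, "write_mem": True},
}

def step(opcode, acc, operand, mem, addr):
    """Execute one instruction by decoding its control word from the ROM."""
    ctrl = MICROCODE_ROM[opcode]
    result = acc + operand if ctrl["alu_add"] else operand  # trivial ALU
    if ctrl["write_mem"]:
        mem[addr] = acc          # control bit triggers a memory write
    if ctrl["latch_acc"]:
        acc = result             # control bit latches the ALU output
    return acc, mem
```

Changing the ROM contents changes what each op code does, without touching the logic circuits, which is exactly why microcoding is programming.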
Even when ASICs are employed (or, more rarely,
FPGAs), the logic of the processor is still "programmed" into the device -- and
the people who design these devices may appropriately be termed
"programmers".
I'm not sure that I have the energy to educate a
reader from zero to competence in Computer Science, nor do I expect the reader
to have the patience to read such an epic, if I wrote it.
Your
premature dismissiveness is both presumptuous and misguided. While a resume is a
poor substitute for sound argument, I received my BS degree in
Electrical/Electronic Engineering nearly 30 years ago, have designed and built
CPUs out of nothing but TTL gates, modeled CPUs and state machines in APL and
VHDL, designed CPU-based systems for the space program, and generally spent time
using dozens -- if not hundreds -- of different processors, platforms, and
programming languages.
Hardly. I graduated in Computer Science, but
did a variety of Maths and Engineering (Elec, Mech, Civil and Geology) subjects
on the way. (Long boring story why. -- I'll spare you that). ...
If
you wish, I will support you in applying for a tuition refund from your alma
mater. :)
... I have worked for 23 years as a Software Engineer and
have taught Software Engineering at University and at a Technical
College.
I weep for our children.