
sentiments



I believe I must explain to you why I am as upset about internal-define
as I am.  This report has been a long time in coming and as it approaches
its final form, it begins to look quite splendid.  So much so that I
believe that its potential for generating converts will be quite
substantial.  That was, in fact, one of its goals.  However, we Schemers
have had the freedom for several years of writing code for papers and
books in just about any dialect we felt the audience could/would follow.
Scheme has been a fabulous tool for that purpose.  The simplicity of the
language and the elegance of its definition attracted us and I am sure
many of you towards it.  How simple was it?  In the presence of macro
expansion, we needed the following 4 special forms:  "quote", "lambda",
"if", and "set!".  Many implementations began to exist on micros.  One
of my students, Mark Meyer, implemented an entire Scheme system including
engines, macros, a compiler, and the run-time support in 5K bytes on an
IBM PC.  He did this as an undergraduate in the 5 weeks following my
undergraduate programming languages course.  Now it was the case that
"set!" assigned to the closest lexical "rib" and we assumed that the
bottom rib had all known identifiers assigned to either "unbound" or,
in the case of +, to some value.  This really was simple.  With this view,
define was merely an alias for set!.  However, define's value was its
first argument and set!'s value was its second argument.
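
The aliasing just described might be sketched as a macro (a hedged
illustration, not historical code: the syntax-rules notation used here
postdates this message, and "my-define" is an invented name):

```scheme
;; Sketch: define as an alias for set!, differing only in the value
;; returned.  Assumes the bottom rib already holds a slot for every
;; known identifier, so assignment never needs to create a binding.
(define-syntax my-define
  (syntax-rules ()
    ((_ var expr)
     (begin (set! var expr)   ; assigns to the closest lexical rib
            'var))))          ; define's value is its first argument
```

Under the dialect described above, (set! x 3) returns 3, while
(my-define x 3) returns the symbol x.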

This is how things were prior to the Brandeis meeting.  As I recall the
feeling at the Brandeis meeting, most of us were willing to go along with
the idea of limiting the use of "define" internally to the semantics of
SI&CP.  This was in the interest of good feeling about not undermining
their book.  I hope everyone understands that I do not want to undermine
their book, although I would like to see them rewrite it without internal
defines.  However, when I went along with this view I had underestimated
how damaging this decision would be to the characterization of Scheme being
a simple language.  I was expecting a "comment" in the report that stated
that "use of define within the text of a program would be restricted to
the use as given in SI&CP."  Instead, the definition of "lambda" got changed.
Instead, begin must be added to the list of special forms.  Instead, the
macro for "letrec" must be conscious of it by having the body be
"(let () <body>)" where it should be <body>.  Instead, define must be added
to the list of special forms.  Instead, the semantics of "set!" are weakened
so that it is no longer possible to get by with "set!" alone and ignore
"define".  Instead, we must do something "special" with "lambda",
"named-lambda", "let", "let*", and "letrec".

We have taken a rather elegant language and made it elitist.  I wanted
all of my students to be able to implement "Scheme" when they walked out
of my course; now that is no longer possible.  It was a great beauty of
Scheme that the four mentioned special forms along with identifiers and
application were the only syntax.  It was wonderful that a CPS interpreter
for Scheme was all one needed to come to grips with in order to
understand the run-time architecture of Scheme.
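
Such a CPS interpreter can be sketched in under a page (a hedged
illustration, not any particular implementation: "meval" and "meval*"
are invented names, environments are assumed to be alists of mutable
pairs, and lambda bodies are restricted to a single expression):

```scheme
;; CPS evaluator for the four-special-form core -- quote, lambda, if,
;; set! -- plus identifiers and application.
(define (meval expr env k)
  (cond
    ((symbol? expr) (k (cdr (assq expr env))))       ; identifier lookup
    ((not (pair? expr)) (k expr))                    ; self-evaluating
    ((eq? (car expr) 'quote) (k (cadr expr)))
    ((eq? (car expr) 'if)
     (meval (cadr expr) env
            (lambda (test)
              (meval (if test (caddr expr) (cadddr expr)) env k))))
    ((eq? (car expr) 'set!)
     (meval (caddr expr) env
            (lambda (v)
              (set-cdr! (assq (cadr expr) env) v)    ; closest rib wins
              (k v))))                               ; set!'s value: its 2nd argument
    ((eq? (car expr) 'lambda)
     (k (lambda (args k2)
          (meval (caddr expr)                        ; one-expression body
                 (append (map cons (cadr expr) args) env)
                 k2))))
    (else                                            ; application
     (meval (car expr) env
            (lambda (f)
              (meval* (cdr expr) env
                      (lambda (args) (f args k))))))))

;; Evaluate a list of argument expressions, left to right.
(define (meval* exprs env k)
  (if (null? exprs)
      (k '())
      (meval (car exprs) env
             (lambda (v)
               (meval* (cdr exprs) env
                       (lambda (vs) (k (cons v vs))))))))
```

For example, (meval '((lambda (x) (if x 'yes 'no)) 'ok)
(list (cons 'dummy #f)) (lambda (v) v)) evaluates to the symbol yes.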

My argument with internal-define is not that it is good or bad, but
that the subtlety of its definition is unnecessary given judicious use
of letrec, and letrec should be a trivial macro.  What happened?

I have never liked internal defines.  I thought they were harmless
until I saw what havoc they introduced into the Report.  I am trying
desperately to convince everyone that we made a mistake, and that we
should do everything in our power, before we go public with this Report,
to hold off on imposing internal defines on everyone.

					Dan