
Re: Why would anyone want opacity?



|   Date: Mon, 13 May 1996 12:22:29 -0400 (EDT)
|   From: david carlton <carlton@math.mit.edu>
|   Cc: ziggy@martigny.ai.mit.edu, rrrs-authors@martigny.ai.mit.edu,
|	   carlton@math.mit.edu
|   References: <199605102034.QAA19760@fs.CS.Princeton.EDU>
|	   <199605102215.SAA29584@zayin.CS.Princeton.EDU>
|
|   On Fri, 10 May 1996 18:15:44 -0400, Matthias Blume <blume@CS.Princeton.EDU> said:
|
|   > But then: why generic arithmetic?  I know, it's kinda cute.  But is
|   > it worth it?  Don't we know most of the time in advance that a given
|   > variable will always hold a small integer?  Or a real?  Is the
|   > trouble of automatically injecting into and projecting from a
|   > universal number type (with the associated run-time checks) really
|   > worth it?
|
|   For me, lots of time, it is.  I'm a mathematician, and these days most
|   of the time that I'm programming I'm implementing algorithms that
|   would work largely or entirely without change over much larger classes
|   of "numbers" than rationals or reals or whatever.  And even when I'm
|   dealing with just rationals or integers, having arbitrary precision
|   arithmetic is absolutely essential.  And when I say "would work" in
|   the above, I'm not talking about some sort of abstract possibility; I
|   mean that it happens all the time that I wish that I could run some of
|   my code in a more generic number system, that I didn't have to go
|   running to Maple (which I can't stand) every time that I want to allow
|   the possibility of dealing with polynomials, say, rather than rational
|   numbers.

Or writing assemblers for compilers.  When Jim Miller and I ported the
MIT Scheme compiler to the DEC Alpha, the difference in word size in
the assembler was hidden from us by generic arithmetic.

In the cross-compiler, running on a 32-bit machine, bignums were being
manipulated all the time.

In the self-hosted version, running on a 64-bit machine, fixnums were
being manipulated all the time.
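
The assembler source itself never mentioned the representation.  Here
is a minimal sketch of the kind of operation involved (the procedure
and the field layout are invented for illustration, not taken from the
actual compiler):

    ;; Place VALUE into an instruction word starting at bit POSITION,
    ;; using nothing but generic arithmetic: no fixnum/bignum
    ;; distinction appears anywhere in the source.
    (define (insert-field value position)
      (* value (expt 2 position)))

    (insert-field #xABCD 40)
    ;; On the 32-bit cross-compilation host this result is a bignum;
    ;; on the 64-bit Alpha it fits comfortably in a fixnum.  Same
    ;; code, same answer; the runtime picks the representation.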

One of the decisions in the design of ML that I have never figured out
is the requirement that integer overflow raise an exception.  This
makes virtually every program dependent on the word size.
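
For contrast, the classic example in Scheme (a sketch; the exact
fixnum range varies by implementation):

    (define (fact n)
      (if (zero? n)
          1
          (* n (fact (- n 1)))))

    (fact 20)  ; => 2432902008176640000
    ;; When an intermediate product overflows the fixnum range it
    ;; silently becomes a bignum, so this returns the same correct
    ;; answer on a 32-bit machine and a 64-bit one.  Under ML's rule
    ;; the same computation raises an overflow exception on a 32-bit
    ;; machine but succeeds on a 64-bit one: exactly the word-size
    ;; dependence described above.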