
Re: Why would anyone want opacity?



On Mon, 13 May 1996 15:02:07 -0400, Matthias Blume <blume@CS.Princeton.EDU> said:

> [ example of why one sometimes needs generic arithmetic deleted ]

> Of course there are times when one needs such things.  My objection
> is to making generic numbers the *default*.  Most of the numbers in
> my programs are small integers, and I expect this to be true even in
> heavy generic math packages.  We can safely assume that loop
> indices, array subscripts, etc., are not quaternions most of the
> time.  And most of the numbers (at least in my programs) are loop
> indices of some sort.

Different styles, I guess - I prefer to use lists to do my looping, so
it's not so much of an issue.  I just looked at the file of code that
I had in mind when I wrote my objection, and there wasn't a single
integer in it that I didn't want to be handled with generic arithmetic.
(For example, there were a few 2's in it, but the 2's were getting
multiplied by something that I couldn't guarantee was an integer, let
alone a small one.)
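(To make the point concrete - sketched in Python rather than Scheme, since
Python's arithmetic is generic in the same spirit: a literal 2 simply
combines with whatever numeric type it meets, and the result's type follows
the operand.)

```python
from fractions import Fraction

# Under generic arithmetic, the same literal 2 multiplies cleanly
# against an exact rational, a float, or a complex number:
print(2 * Fraction(1, 3))   # 2/3  (exact rational)
print(2 * 3.5)              # 7.0  (float)
print(2 * (1 + 2j))         # (2+4j)  (complex)
```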

The code did use records, which might be implemented using small
integers as indices, but that's really an implementation decision, and
the reasonable solution there is for the record package implementor to
worry about efficiency issues in that package, not for me to worry
about them.

> If you really need to use a generic Number type it would be better to
> define a signature for the corresponding algebraic structure (pun
> intended) and abstract your code over this structure.

Better for whom?  My code was easy to write, expresses what I mean,
and works fast enough for what I want it to do.  I don't see what the
problem is that signatures are supposed to solve.

> From: "Guillermo J. Rozas" <gjr@martigny.ai.mit.edu>

>> One of the decisions in the design of ML that I have never figured
>> out was the requirement that integer overflow raise an exception.
>> This makes virtually every program dependent on the word size.

> Well, you certainly put your finger on a weak spot here, but the
> same spot is even weaker in current Scheme, because implementations
> are only encouraged, but not required to provide bignums.

So in Scheme you're likely to be able to use bignums, but aren't
absolutely guaranteed to be able to, whereas in ML you certainly (or
almost certainly?) can't, and this is an advantage for ML?  I think
I'm missing something here.  In both languages you can roll your own
bignums, in both languages it's probably a pain to do so, but at least
in Scheme you probably won't have to.  Given that rolling my own
bignum package isn't my idea of fun, Scheme seems like a clear win to
me.
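(The contrast can be sketched in Python, whose integers are bignums by
default; the fixed-width check below is a hypothetical stand-in for
ML-style word-size arithmetic, with WORD_MAX assuming a 32-bit signed
word.)

```python
# With bignums as the default, word size never leaks into the program:
big = 2 ** 64 + 1          # exceeds any machine word, still exact
assert big * big == 2 ** 128 + 2 ** 65 + 1

# Simulating ML-style fixed-width behavior: an operation whose result
# doesn't fit the word must signal an error.
WORD_MAX = 2 ** 31 - 1     # hypothetical 32-bit signed word

def checked_add(a, b):
    r = a + b
    if r > WORD_MAX:
        raise OverflowError("integer overflow")
    return r
```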

david carlton
carlton@math.mit.edu

       Look DEEP into the OPENINGS!!  Do you see any ELVES or
       EDSELS...  or a HIGHBALL??...