
Re: Why would anyone want opacity?



|   Date: Mon, 13 May 1996 15:02:07 -0400
|   From: Matthias Blume <blume@CS.Princeton.EDU>

|   Of course there are times when one needs such things.  My objection is
|   to making generic numbers the *default*.  Most of the numbers in my
|   programs are small integers, and I expect this to be true even in
|   heavy generic math packages.  We can safely assume that loop indices,
|   array subscripts, etc., are not quaternions most of the time.  And
|   most of the numbers (at least in my programs) are loop indices of some
|   sort.

Why not make it the default?  I can only imagine efficiency being the
issue, and there are ways to overcome that.  Besides, a non-generic
implementation that is required to raise an exception on overflow has
to test for overflow anyway, which is much the same per-operation cost
as the generic dispatch.
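
To put that cost comparison in code -- just a sketch, in OCaml-flavored
ML with invented names (num, generic_add, checked_add), not anybody's
actual implementation: the generic default pays a tag test before the
machine add, and mandatory overflow detection pays a range test around
it.

    (* Sketch only: "Big" stands in for some arbitrary-precision
       representation; the interesting part is the per-operation test,
       not the bignum code. *)
    type num = Fix of int | Big of int list

    (* generic default: one tag dispatch, then the machine add *)
    let generic_add a b =
      match a, b with
      | Fix x, Fix y -> Fix (x + y)        (* plus a promotion check *)
      | _, _         -> failwith "bignum addition elided"

    (* fixed-size integers with a required overflow exception:
       one range test, then the machine add -- comparable cost *)
    exception Overflow

    let checked_add x y =
      if (y > 0 && x > max_int - y) || (y < 0 && x < min_int - y)
      then raise Overflow
      else x + y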

|   If you really need to use a generic Number type it would be better to
|   define a signature for the corresponding algebraic structure (pun
|   intended) and abstract your code over this structure.

Right, and then I would have had to modify the assembler instead of
using it verbatim.  Which version is saving me work?
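
For concreteness, the sort of parameterization being suggested looks
roughly like this (a sketch in OCaml-flavored ML; RING, Poly, and
IntRing are made-up names):

    (* the algebraic structure as a signature *)
    module type RING = sig
      type t
      val zero : t
      val one  : t
      val ( + ) : t -> t -> t
      val ( * ) : t -> t -> t
    end

    (* client code abstracted over the structure *)
    module Poly (R : RING) = struct
      (* dense polynomials over R, coefficients low to high degree *)
      type t = R.t list

      let rec add p q =
        match p, q with
        | [], r | r, [] -> r
        | a :: p', b :: q' -> R.( + ) a b :: add p' q'
    end

    (* instantiating at plain machine integers *)
    module IntRing = struct
      type t = int
      let zero = 0
      let one  = 1
      let ( + ) = Stdlib.( + )
      let ( * ) = Stdlib.( * )
    end

    module IntPoly = Poly (IntRing)

That works fine for code written with the functor in mind; it does not
help with reusing an existing program verbatim.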

|
|   > One of the decisions in the design of ML that I have never figured out
|   > was the requirement that integer overflow raise an exception.
|   > This makes virtually every program dependent on the word size.
|
|   Well, you certainly put your finger on a weak spot here, but the same
|   spot is even weaker in current Scheme, because implementations are
|   only encouraged, but not required to provide bignums.  Moreover, there
|   is no standard way of defending against overflow in Scheme right now,
|   so virtually every program potentially depends on word size and
|   implementation.

Hah?  Detecting overflow is trivial, so it is easy to shadow the
standard definitions with versions that check, if that is what you want.
The only issue is efficiency, but compilers can be taught to recognize
such checking patterns and use the appropriate instructions where the
hardware provides them.
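
For what it's worth, such a shadowing definition looks roughly like
this -- a sketch in OCaml-flavored ML rather than Scheme, with Overflow
and the redefined ( + ) as invented names, and assuming native ints
wrap around on overflow:

    exception Overflow

    (* shadow the standard addition with a checking version; the sign
       test below is exactly the kind of pattern a compiler could map
       onto an add-that-traps-on-overflow instruction *)
    let ( + ) a b =
      let s = Stdlib.( + ) a b in     (* the ordinary machine add *)
      if (a >= 0) = (b >= 0) && (s >= 0) <> (a >= 0)
      then raise Overflow             (* operands agree, sum disagrees *)
      else s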

|   Moreover, the new standard basis for ML defined the structure Int32,
|   which provides 32-bit integers for those who need them.

So they've made it even more concrete, rather than more abstract.  That
seems like the wrong direction.