Duck typing allows true polymorphism

<> wrote in message

let's say you want a generic numerical algorithm like sum


def sum(lst)
  lst.inject(0) { |total, current| total + current }
end

Java: // I don't know if there is a numeric superclass for numbers

[snip- int only example]

There is, and it is interesting because it highlights the
real difference: Java uses C-style numerics, where
you select the model of arithmetic you want using
the type system. Here's a 'generic sum' for Java:

public static double sum(Iterable<? extends Number> list) {
    double total = 0;
    for (Number n : list)
        total += n.doubleValue();
    return total;
}

If I got it right (I'm a bit rusty with Java) that will sum
any collection containing any kind of number.
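In case my rusty Java has slipped somewhere, here is a self-contained version you can actually compile and run; the loop accumulates via Number.doubleValue(), which is what I intended, and the class name is just for the example:

```java
import java.util.List;

public class GenericSum {
    // Sums any collection of any Number subtype by converting
    // each element to double before accumulating.
    public static double sum(Iterable<? extends Number> list) {
        double total = 0;
        for (Number n : list)
            total += n.doubleValue();
        return total;
    }

    public static void main(String[] args) {
        // Mixed Integer, Double, and Long elements all work:
        System.out.println(sum(List.of(1, 2.5, 3L)));  // 6.5
    }
}
```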

But you have to specify that you want *double* arithmetic,
as you see; had I chosen int, it would have produced different results.
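To see what I mean about the choice mattering, compare int and double arithmetic on the same operands; the wrapped value below follows from 32-bit two's-complement overflow:

```java
public class ArithmeticChoice {
    public static void main(String[] args) {
        int a = 2_000_000_000;
        // int arithmetic wraps silently past 2^31 - 1:
        System.out.println(a + a);          // -294967296
        // double arithmetic keeps the magnitude (at reduced precision):
        System.out.println((double) a + a); // 4.0E9
    }
}
```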

In some languages, like Ruby I think, you cannot make
this choice; your language chooses for you. Most languages
that do this prefer accuracy to speed, so everything gets
promoted to larger types on demand, and things like
rounding and overflow are avoided.
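You can mimic that promote-on-demand behaviour by hand in Java with BigInteger; a sketch of roughly what Ruby's integers do for you automatically:

```java
import java.math.BigInteger;

public class Promotion {
    public static void main(String[] args) {
        long big = Long.MAX_VALUE;
        // Fixed-width long wraps around at 2^63 - 1:
        System.out.println(big + 1);  // -9223372036854775808
        // BigInteger grows on demand instead of overflowing:
        System.out.println(BigInteger.valueOf(big).add(BigInteger.ONE));
        // 9223372036854775808
    }
}
```

The difference is that in Java you must opt into the unbounded type explicitly; nothing promotes a long for you.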

However, even if you agree with this choice, you can
still get into trouble. Promoting to 'double' or 'float'
imposes different errors than you'd get with ints, but
they still exist. Sometimes a ratio type is better; other
times you would prefer a decimal type.
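The classic demonstration of the decimal case, using Java's BigDecimal (Java has no built-in ratio type, though Ruby ships a Rational):

```java
import java.math.BigDecimal;

public class DecimalVsBinary {
    public static void main(String[] args) {
        // Binary floating point cannot represent 0.1 exactly:
        System.out.println(0.1 + 0.2);  // 0.30000000000000004
        // A decimal type carries exactly the digits you wrote:
        System.out.println(new BigDecimal("0.1").add(new BigDecimal("0.2")));  // 0.3
    }
}
```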

The greatest advantage of static type systems is to
expose design decisions like these in a way the
compiler can see.