>For instance, I think it's a violation of the Beginner's Principle of
>Least Surprise to make everything an object. To a beginner, a number is
>just a number. A string is a string. They may well be objects as far as
>the computer is concerned, and it's even fine for experts to treat them as
>objects. But premature OO is a speed bump in the novice's onramp. "
I think Larry’s off his rocker on this one. Consistency is far
more important than familiarity. IMHO, Larry is demonstrating a
widely-held bias that objects are somehow “different” and should
be segregated and not taught to beginners.
I think that’s a load of crap; exposing native types as non-objects
in Java, for instance, is to me one of the largest failings of that language.
I think that’s why so many people have trouble mastering OO concepts:
because it’s taught as a sort of an add-on to already entrenched
procedural, linear thinking. It’s been my experience that you get
better mileage if you start with objects right out of the gate – which
is exactly what Dave and I did in the pickaxe book.
To the best of my knowledge, not one reviewer or fan letter has
criticized that decision. Far from it – every bit of feedback we’ve
gotten so far has been uniformly positive that we started right off
with objects, without excuses.
>They may well be objects as far as
>the computer is concerned
Actually, I think that’s backwards – I don’t care what the
computer thinks about these things. By the time it hits the
CPU it sure as hell isn’t an object anymore. I want
to think of it as an object, and I want the other programmers
on my team to think of it as an object as well.
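Andy’s consistency argument is easy to see in a few lines of Ruby; a minimal sketch (modern Ruby prints Integer where the Ruby of 2002 printed Fixnum):

```ruby
# Even literals are full objects in Ruby -- there are no special-cased
# "native types", which is exactly the consistency being argued for.
puts 5.class               # Integer (Fixnum in 2002-era Ruby)
puts "hello".class         # String
puts(-7.abs)               # 7 -- numbers respond to messages
puts "hello".length        # 5 -- so do strings
puts 3.times.to_a.inspect  # [0, 1, 2]
```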
Author of “The Pragmatic Programmer” * “Programming Ruby” * The Agile Manifesto
Columnist for IEEE Software Magazine * Board of Directors, Agile Alliance
Pragmatic T-shirts available at: www.pragmaticprogrammer.com/merchandise.html
No. In a nutshell, he had some good things to say about Ruby.
Come on, this is Larry Wall.
···
-----Original Message-----
From: Andrew Hunt [mailto:andy@toolshed.com]
Sent: Friday, September 06, 2002 10:48 AM
To: ruby-talk ML
Subject: Re: Larry Wall’s comments on Ruby
>For instance, I think it's a violation of the Beginner's Principle of
>Least Surprise to make everything an object. To a beginner, a number is
>just a number. A string is a string. They may well be objects as far as
>the computer is concerned, and it's even fine for experts to treat them as
>objects. But premature OO is a speed bump in the novice's onramp. "
I think Larry’s off his rocker on this one. Consistency is far
more important than familiarity. IMHO, Larry is demonstrating a
widely-held bias that objects are somehow “different” and should
be segregated and not taught to beginners.
Which, given the time of his rise to ascendancy, would be expected. We
are all a product of “our times”. While complexity theory has been
around since the ’70s (some would say off and on since the time of the
Egyptians), it was not widely known outside of very tight academic
circles before the mid-’90s. Thus, Larry can be forgiven for thinking
the concept of “objects” too difficult for the beginner.
I think that’s a load of crap; exposing native types as non-objects
in Java, for instance, is to me one of the largest failings of that language.
Unfortunately, it is worse than just a “mere load of crap”; it’s a time
bomb. I know through a contact that one large brand (not at liberty to say
who) had to replace their Java-based e-commerce solution in order to
scale up efficiently. (My contact solved their problem with Perl/MySQL.)
I suspect, as time unfolds many more such issues will arise, especially
regarding maintainability.
I think that’s why so many people have trouble mastering OO concepts:
because it’s taught as a sort of an add-on to already entrenched
procedural, linear thinking. It’s been my experience that you get
better mileage if you start with objects right out of the gate – which
is exactly what Dave and I did in the pickaxe book.
OO isn’t taught well IMO. “Pickaxe” is a rare exception. The irony is
that Ruby is closely modelled in accordance with contemporary thinking
in modern theoretical physics where everything -is- an object (in
concept, not necessarily name). The physical world in which we live is
de facto Rubyesque.
>For instance, I think it's a violation of the Beginner's Principle of
>Least Surprise to make everything an object. To a beginner, a number is
>just a number. A string is a string. They may well be objects as far as
>the computer is concerned, and it's even fine for experts to treat them as
>objects. But premature OO is a speed bump in the novice's onramp. "
I think Larry’s off his rocker on this one. Consistency is far
more important than familiarity. IMHO, Larry is demonstrating a
widely-held bias that objects are somehow “different” and should
be segregated and not taught to beginners.
I think that’s a load of crap; exposing native types as non-objects
in Java, for instance, is to me one of the largest failings of that language.
I think that’s why so many people have trouble mastering OO concepts:
because it’s taught as a sort of an add-on to already entrenched
procedural, linear thinking. It’s been my experience that you get
better mileage if you start with objects right out of the gate – which
is exactly what Dave and I did in the pickaxe book.
I agree completely. I think Ruby might be the best language to learn as
your first computer language, with its relatively forgiving syntax and
consistently object-oriented logic.
After all, isn’t the whole purpose of OO programming to let the person
think as a human instead of trying to think as a machine?
The way programming is taught right now, first you wrestle your brain
into thinking in terms of computer logic so that you can code in C, and
then wrestle it back into using human logic again when you learn OOP in,
say, C++. Such an unnatural process.
I don’t think Larry’s off his rocker. In fact Andy’s words say otherwise:
"I thinks that's why so many people have
trouble mastering OO concepts:
because it's taught as a sort of an add-on
to already entrenched procedural, ..."
OO is still not taught as the norm. Thus beginners will be surprised by
everything’s-an-object, regardless of what Ruby has done. The existence of
Ruby with ‘OO everywhere’ won’t change this.
So Larry was correct. That’s not to say that the world needs to accede to
this pedagogy, but it is a positive thing to recognize reality, even if you
decide not to change.
Once the world comes to an agreement that everything should be an object
(and I’m not convinced we will), then it will eventually be taught that way.
Till then, beginners will continue to be surprised by Ruby.
My first professional language in 1985 was an object-oriented LISP. I was
quite surprised by it. I don’t think things have changed that much since
then. Except that OO is a more commonly taught additional technique.
I think Larry’s off his rocker on this one. Consistency is far
more important than familiarity. IMHO, Larry is demonstrating a
widely-held bias that objects are somehow “different” and should
be segregated and not taught to beginners.
I think that’s a load of crap; exposing native types as non-objects
in Java, for instance, is to me one of the largest failings of that language.
I think that’s why so many people have trouble mastering OO concepts:
because it’s taught as a sort of an add-on to already entrenched
procedural, linear thinking. It’s been my experience that you get
better mileage if you start with objects right out of the gate – which
is exactly what Dave and I did in the pickaxe book.
Hmm, this comment reminds me of when I took Sun’s “Java Programming
Language” course a couple of years ago. It was advertised as being for C
and C++ programmers, and on the first day the instructor spent a good
deal of time making sure everybody knew the prerequisites were no joke.
Unsurprisingly, the lessons made frequent references to C and C++
concepts, which were mostly over my head. But with my Python background,
I had very little trouble with the parts of the course that were
actually about Java (can’t say the same for all my classmates, many of
whom did have the prerequisites).
···
On Sat, Sep 07, 2002 at 02:48:18AM +0900, Andrew Hunt wrote:
I think that’s why so many people have trouble mastering OO concepts:
because it’s taught as a sort of an add-on to already entrenched
procedural, linear thinking. It’s been my experience that you get
better mileage if you start with objects right out of the gate – which
is exactly what Dave and I did in the pickaxe book.
>For instance, I think it's a violation of the Beginner's Principle of
>Least Surprise to make everything an object. To a beginner, a number is
>just a number. A string is a string. They may well be objects as far as
>the computer is concerned, and it's even fine for experts to treat them as
>objects. But premature OO is a speed bump in the novice's onramp. "
I think Larry’s off his rocker on this one. Consistency is far
more important than familiarity. IMHO, Larry is demonstrating a
widely-held bias that objects are somehow “different” and should
be segregated and not taught to beginners.
This part jumped out at me too. To a beginner (say, my girlfriend), 1 is a
number, and “This is a couple of things” is a quoted sentence.
Strings are things that our cats play with.
When he says Beginner’s Principle of Least Surprise, he seems to mean
programmers who learned C first. That, to me, is a very strange
definition of a ‘novice’.
Consistency is far more important than familiarity.
But Ruby’s consistency gave in to familiarity in certain
(what I consider) fairly significant areas.
For example, we only have “everything’s an object” behavior
for some math intrinsics, and the interface for files is a bit
strange, e.g., I expected 5.cos or aFile.rename('toThisName'),
not Math.cos(5) and File.rename(aFile, 'toThisName').
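For what it’s worth, Ruby’s open classes let you graft the expected receiver-style interface on yourself; a sketch (this is not standard-library behaviour, just a delegation to the existing Math module function):

```ruby
# Hypothetical sketch: adding the 5.cos spelling to all numbers
# by reopening Numeric and delegating to Math.cos.
class Numeric
  def cos
    Math.cos(self)
  end
end

puts 0.cos                    # 1.0
puts Math::PI.cos.round(10)   # -1.0
```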
OO isn’t taught well IMO. “Pickaxe” is a rare exception. The irony is
that Ruby is closely modelled in accordance with contemporary thinking
in modern theoretical physics where everything -is- an object (in
Actually, I am curious. Do you have something more specific in mind,
or was this just meant to be an admittedly very clever line?
concept, not necessarily name). The physical world in which we live is
de facto Rubyesque.
I like Ruby a lot, but I find that Ruby’s single-method-receiver
OO incarnation (unfortunately the OO norm) models the
mathematical world badly, at least – the ingenious but in the end
hackish and ``non-OO’’ coerce framework of the Numeric
class hierarchy is an unfortunate proof of my point…
When he says Beginner’s Principle of Least Surprise, he seems to mean
programmers who learned C first. That, to me, is a very strange
definition of a ‘novice’.
One might consider “teachability” as an alternate metric, since the task
of communicating programming language concepts to a given target
audience can at least be estimated. (And personally, I’d rank Python
above Ruby according to that metric; but both way above Perl. The
idea of designing a curriculum to teach Perl to novice programmers
scares me, to be honest.)
I think Larry’s off his rocker on this one. Consistency is far
more important than familiarity. IMHO, Larry is demonstrating a
widely-held bias that objects are somehow “different” and should
be segregated and not taught to beginners.
Which, given the time of his rise to ascendancy, would be expected. We
are all a product of “our times”. While complexity theory has been
around since the ’70s (some would say off and on since the time of the
Egyptians), it was not widely known outside of very tight academic
circles before the mid-’90s. Thus, Larry can be forgiven for thinking
the concept of “objects” too difficult for the beginner.
I wonder whether you know what complexity theory is about…
Consistency is far more important than familiarity.
But Ruby’s consistency gave in to familiarity in certain
(what I consider) fairly significant areas.
For example, we only have “everything’s an object” behavior
for some math intrinsics, and the interface for files is a bit
strange, e.g., I expected 5.cos or aFile.rename('toThisName'),
not Math.cos(5) and File.rename(aFile, 'toThisName').
The methods you can call on numbers are intended, IMO, to represent the things
that numbers have in common. Operations like cos() make sense as pure
functions.
aFile.rename('tothisname'), while it looks appealing, doesn’t make sense.
aFile is a file object – it doesn’t have a name (see for yourself).
Directories map names to files. A file can be known by many names. If Ruby
had a class FileName, then we could do what you want. (Java’s File class is
about file names, not about files).
(I might be wrong.)
–
Bil
Gavin
···
----- Original Message -----
From: “Bil Kleb” W.L.Kleb@larc.nasa.gov
To: “ruby-talk ML” ruby-talk@ruby-lang.org
Sent: Saturday, September 07, 2002 8:01 PM
Subject: Re: Larry Wall’s comments on Ruby
For example, we only have “everything’s an object” behavior
for some math intrinsics and the interface for files is a bit
strange, e.g., I expected 5.cos
This one’s been discussed a lot before.
…or aFile.rename(‘toThisName’),
I think the way to look at this is to think “who” it is that does the
renaming. In the case of a file, it’s effectively the filesystem that does
(e.g. in UNIX, it involves changing the contents of a directory file).
I think it makes more sense to ask the class (i.e., the thing that logically
knows about files) to do it … hence, I feel this case isn’t surprising.
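The “ask the filesystem” view is exactly how the class-method API reads in practice; a small self-contained sketch using a temporary directory:

```ruby
require "tmpdir"

Dir.mktmpdir do |dir|
  old_path = File.join(dir, "before.txt")
  new_path = File.join(dir, "after.txt")
  File.write(old_path, "hello")

  # The File *class* (standing in for the filesystem) does the rename;
  # no File object for the file even needs to be open.
  File.rename(old_path, new_path)

  puts File.exist?(new_path)   # true
  puts File.exist?(old_path)   # false
end
```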
Which, given the time of his rise to ascendancy, would be expected. We
are all a product of “our times”. While complexity theory has been
around since the ’70s (some would say off and on since the time of the
Egyptians), it was not widely known outside of very tight academic
circles before the mid-’90s. Thus, Larry can be forgiven for thinking
the concept of “objects” too difficult for the beginner.
To clear up the discussion here, I believe that Kent is referring to
complexity theory as the problem of understanding and managing large
systems, i.e. how to write a big computer program: how to divide it up into
chunks and how those chunks interact.
Complexity Theory, to a computer scientist, is about understanding how
algorithms scale: if I give my sorting function an array that’s 10 times
longer to sort does it take 10 times as long or 10*10 = 100 times as
long? It tells you why QuickSort is better than BubbleSort if you’re
sorting more than a few objects. Much effort is spent both
understanding algorithms and designing new ones that scale better.
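The scaling difference described above can be sketched in a few lines: a naive O(n²) bubble sort next to Ruby’s built-in sort (O(n log n)). Both give the same answer; only one stays fast as the input grows.

```ruby
# Naive O(n^2) bubble sort, for comparison with Array#sort.
def bubble_sort(list)
  a = list.dup
  loop do
    swapped = false
    (a.length - 1).times do |i|
      if a[i] > a[i + 1]
        a[i], a[i + 1] = a[i + 1], a[i]  # swap out-of-order neighbours
        swapped = true
      end
    end
    break unless swapped  # a full pass with no swaps means sorted
  end
  a
end

data = [5, 3, 8, 1, 9, 2]
puts bubble_sort(data).inspect  # [1, 2, 3, 5, 8, 9]
puts data.sort.inspect          # same answer, far better scaling
```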
Unsurprisingly, the lessons made frequent references to C and C++
concepts
I used to have to put up with lessons on how to do two’s complement
arithmetic and the hexadecimal numbering system just so that I could be
taught Pascal which kept you away from all that stuff. At the time I was
teaching myself Lisp on a ZX Spectrum.
For example, we only have “everything’s an object” behavior
for some math intrinsics, and the interface for files is a bit
strange, e.g., I expected 5.cos or aFile.rename('toThisName'),
not Math.cos(5) and File.rename(aFile, 'toThisName').
Why? Once I get used to {}.invert, 5.abs or 5.0.finite?, I find
zero, zip, nil reasons to expect that 5.0.inverse, 5.0.sin or
5.prime? would work differently.
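The split being discussed is easy to demonstrate in an unmodified Ruby; a sketch:

```ruby
# Some numeric behaviour lives on the number itself...
puts 5.abs          # 5
puts 5.0.finite?    # true

# ...while trigonometry lives in the Math module, not on the receiver.
puts Math.sin(0.0)  # 0.0

begin
  5.0.sin           # not defined on Float in stock Ruby
rescue NoMethodError => e
  puts "undefined: #{e.name}"
end
```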
The methods you can call on numbers are intended, IMO, to represent the things
that numbers have in common. Operations like cos() make sense as pure
functions.
Really? A look at ``complex.rb’’ might change your opinion.
Here are the relevant bits
···
module Math
  ...
  # alias old versions
  alias sinh! sinh
  alias cosh! cosh
  alias sin! sin
  ...
  def sin(z)
    if Complex.generic?(z)
      sin!(z)
    else
      Complex(sin!(z.real) * cosh!(z.image),
              cos!(z.real) * sinh!(z.image))
    end
  end
  ...
end
If ``sin’’ were implemented more consistently as an instance
method of the built-in classes Float and Fixnum, this
would be
class Complex
  ...
  def sin
    Complex(real.sin * image.cosh, real.cos * image.sinh)
  end
end
If you ever wanted to add another numerical class with a sensible
``sin’’ method, you would need to put another level of indirection into
the Math module function ``sin’’ – none of this is very OO’ish
imho.
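Under the instance-method design being argued for, a new numeric class needs no changes to Math at all; a hypothetical sketch (Dual is an invented toy class, not part of Ruby, and Float#sin is added here by hand):

```ruby
# Receiver-style sin for the built-in floats, delegating to Math.
class Float
  def sin
    Math.sin(self)
  end
end

# A hypothetical new numeric class: dual numbers, which carry a
# derivative (eps) along with the value (re). It supplies its own #sin...
class Dual
  attr_reader :re, :eps

  def initialize(re, eps)
    @re, @eps = re, eps
  end

  def sin
    # d/dx sin(x) = cos(x), so the derivative part is scaled by cos.
    Dual.new(Math.sin(re), eps * Math.cos(re))
  end
end

# ...and callers stay polymorphic: no case analysis inside Math needed.
puts 0.0.sin                      # 0.0
d = Dual.new(0.0, 1.0).sin
puts d.re                         # 0.0
puts d.eps                        # 1.0  (cos(0) = 1)
```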
I’m curious – why would you rate Python above Ruby for
“teachability”?
I’ve been a programmer for a long time, and I’ve got about a dozen
languages under my belt – and I “got” Ruby quickly. I still can’t
“get” Python. Some of it, of course, might have to do with what I
consider to be the utterly stupid rule of enforced formatting taking
the place of explicit blocks (as Larry said, a paragraph should
have a beginning, a middle, and an end – so should blocks).
I realise that I’m coming from a different point of view than your
stated goal (teaching novice programmers), but for the life of me, I
can’t see why Python would be easier to learn than Ruby. (After all,
isn’t Python’s OO bolted-on, too?)
-austin
– Austin Ziegler, austin@halostatue.ca on 2002.09.07 at 20.34.46
···
On Sat, 7 Sep 2002 14:21:22 +0900, Reimer Behrends wrote:
One might consider “teachability” as an alternate metric, since
the task of communicating programming language concepts to a given
target audience can at least be estimated. (And personally, I’d
rank Python above Ruby according to that metric; but both way
above Perl. The idea of designing a curriculum to teach Perl to
novice programmers scares me, to be honest.)
Totally agree. It’s the same case as with 5.cos: it
doesn’t make sense, because cos(), sin(), etc. are not
behaviour of a number in the regular sense. Those are
trigonometry functions which are applied to those numbers.