Larry Wall's comments on Ruby

Harry Ohlsen wrote:

I think the way to look at this is to think “who” it is that does the
renaming. In the case of a file, it’s effectively the filesystem that
does

Very fair - but that doesn’t necessarily mean that the filesystem should
be implemented as class methods on the File class either. I’ve used a
“FileService” interface which encapsulates the filesystem behaviours. It’s
possible to implement most of a FileService over ftp, or DAV, for example,
which is quite nice. I’ve even remoted a FileService into another user’s
context to operate with alternate permissions. I don’t know what would
have been the best way to do this in Ruby, perhaps as a mixin?

···


Clifford Heath

Albert Wagner wrote:

< snip>

I wonder whether you know what complexity theory is about…

I thought I knew, but maybe not. Tell us. A link would do.

Regards, Christian

Complexity theory deals with the (mainly asymptotic) runtime
and space efficiency of algorithms and networks.
It is not about design considerations in complex programs.

http://www.complexitytheory.com/

It’s just a course overview, but you can get some impressions.

The part you snipped was:

“Which, given the time of his rise to ascendancy, would be expected. We
are all a product of “our times”. While complexity theory has been
around since the 70’s (some would say off and on since the time of the
Egyptians), it was not widely known outside of very tight academic
circles before the mid-90’s. Thus, Larry can be forgiven for thinking
the concept of “objects” too difficult for the beginner.”

At least I don’t see how Complexity Theory fits into this sentence
unless it was meant sarcastically.

Regards, Christian

“Gavin Sinclair” wrote

For example, we only have “everything’s an object” behavior
for some math intrinsics and the interface for files is a bit
strange, e.g., I expected 5.cos or aFile.rename(‘toThisName’),
not Math.cos(5) and File.rename(aFile,‘toThisName’).

5.cos, 5.arcsin, 5.tenthroot, 5.inverse, 5.negative, 5.prime?

It has to stop somewhere.

Why? - once I get used to {}.inverse, 5.abs or 5.0.finite? I find
zero, zip, nil reasons to expect that 5.0.inverse, 5.0.sin or
5.prime? would work differently.
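For what it's worth, Ruby's open classes let one experiment with exactly this; a minimal sketch, reopening Numeric (monkey-patching core classes like this is for illustration only):

```ruby
# Sketch: add the "missing" math methods by reopening Numeric.
# prime? here is a naive trial-division check, sensible only for
# small integers.
class Numeric
  def sin
    Math.sin(self)
  end

  def inverse
    1.0 / self
  end

  def prime?
    return false if self < 2
    (2..Math.sqrt(self)).none? { |i| self % i == 0 }
  end
end

0.0.sin    # => 0.0
2.inverse  # => 0.5
7.prime?   # => true
```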

The methods you can call on numbers are intended, IMO, to represent the
things
that numbers have in common. Operations like cos() make sense as pure
functions.

My take on this is that we do things like sin(x)
rather than x.sin out of sheer mathematical habit,
not based on any compelling logic.

Hal

···

----- Original Message -----
From: “Christoph” chr_news@gmx.net
Newsgroups: comp.lang.ruby
To: “ruby-talk ML” ruby-talk@ruby-lang.org
Sent: Saturday, September 07, 2002 2:02 PM
Subject: Re: Larry Wall’s comments on Ruby

Austin:

%% I realise that I’m coming from a different point of view than your
%% stated goal (teaching novice programmers), but for the life of me, I
%% can’t see why Python would be easier to learn than Ruby. (After all,
%% isn’t Python’s OO bolted-on, too?)

No, it isn’t bolted on a la Perl OO. For reasons that I leave to
psychologists to discern, Guido Van Rossum chose to implement OO in Python
from the get-go such that:

* There is no real encapsulation - all methods and data are public
* Every method defined in a class requires the explicit "self" first parameter
* etc.

Python was always OO insofar as its implementation of OO can be called OO.
Python’s big claim to fame was that infernal indentation syntax. I suppose
dumping encapsulation, imposing code formatting, and making a visual
distinction between a class def and a regular old def by requiring “self” as
the first parameter (and other features) are supposed to make the learning
curve for beginners easier, but I think the actual value of these "features"
toward that end is suspect, to put it diplomatically.

– Bob Calco

%%
%% -austin
%% – Austin Ziegler, austin@halostatue.ca on 2002.09.07 at 20.34.46

An interesting comment, since a paragraph doesn’t
have a specific end token, but is delimited only
by formatting. :)

Hal

···

----- Original Message -----
From: “Austin Ziegler” austin@halostatue.ca
To: “ruby-talk ML” ruby-talk@ruby-lang.org
Sent: Saturday, September 07, 2002 7:40 PM
Subject: Re: Larry Wall’s comments on Ruby

(as Larry said, a paragraph should
have a beginning, a middle, and an end – so should blocks).

Christian Szegedy wrote:

The complexity theory deals with (mainly asymptotic) runtime
a space efficiency of algorithms and networks.
It is not about design considerations in complex programs.

http://www.complexitytheory.com/

It’s just a course overview, but you can get some impressions.

This looks like a rather specific application of complexity theory.
(Hence the “computational” in the title, I take it.)

I wouldn’t be surprised to see complexity theory (the general kind, not
the specific one linked to here) applied to designing large programs,
since complexity theory is used in a multitude of disciplines (economics,
weather forecasting, organizations, etc.).

The part you snipped was:

“Which, given the time of his rise to ascendancy, would be expected. We
are all a product of “our times”. While complexity theory has been
around since the 70’s (some would say off and on since the time of the
Egyptians), it was not widely known outside of very tight academic
circles before the mid-90’s. Thus, Larry can be forgiven for thinking
the concept of “objects” too difficult for the beginner.”

At least I don’t see how Complexity Theory fits into this sentence
unless it was meant sarcastically.

I agree on this though. Object-oriented programming seems much more
structured and orderly, with its encapsulation of behaviour and
data. Too much so to have roots in complexity theory, which deals
more with emergent behaviour not obvious from the individual parts (such
as its chaos-theory side).

Complexity theory might shine light on problems we face in large OO
systems, but it sure ain’t no prerequisite for appreciating the wonderful
logic of OO that seems to be traceable back to Plato :)

···


[ Kent Dahl ] ~ http://www.stud.ntnu.no/~kentda/
student, NTNU - graduate engineering - 5th year
Industrial economics and technological management
(engineering.discipline=Computer::Technology)

Thank you for your reply and the link. I am incompetent to argue the point.
My computer education has been completely informal. However, from my
perspective, I don’t see in the overview at that link anything that
precludes the principles from applying to a complex program. I suppose that
there is either another “complexity theory” or the one that you pointed me to
has been bastardized in the pop science community. My understanding was that
OOP technologies grew out of some sort of studies in complexity, and were an
attempt to deal with that complexity.

Somewhat related: I saw a graph some years ago that made an impression on me.
The y axis was randomness and the x axis was complexity. It was to illustrate
that (a) if randomness was sufficiently high then statistical methods were
useful, and (b) if randomness was very low and complexity was low then
standard mechanical methods were useful. What I found startling was the large
area in the middle and to the right that was not random enough for statistics
to be reliable and too complex for mechanical methods. Yet it was just in this
area, where no reliable tools currently existed, that most of the interesting
and important problems of our time lay.

···

On Saturday 07 September 2002 12:42 pm, Christian Szegedy wrote:

Albert Wagner wrote:

< snip>

I wonder whether you know what complexity theory is about…

I thought I knew, but maybe not. Tell us. A link would do.

Regards, Christian

Complexity theory deals with the (mainly asymptotic) runtime
and space efficiency of algorithms and networks.
It is not about design considerations in complex programs.

http://www.complexitytheory.com/

It’s just a course overview, but you can get some impressions.

The part you snipped was:

“Which, given the time of his rise to ascendancy, would be expected. We
are all a product of “our times”. While complexity theory has been
around since the 70’s (some would say off and on since the time of the
Egyptians), it was not widely known outside of very tight academic
circles before the mid-90’s. Thus, Larry can be forgiven for thinking
the concept of “objects” too difficult for the beginner.”

At least I don’t see how Complexity Theory fits into this sentence
unless it was meant sarcastically.

Regards, Christian

Complexity theory deals with the (mainly asymptotic) runtime
and space efficiency of algorithms and networks.
It is not about design considerations in complex programs.

http://www.complexitytheory.com/

It’s just a course overview, but you can get some impressions.

There are some good links at the Santa Fe Institute:

http://www.santafe.edu/~shalizi/notebooks/complexity.html

James

I wrote, paraphrasing something from the /. Larry Wall interview:

(as Larry said, a paragraph should have a beginning, a middle,
and an end – so should blocks).
An interesting comment, since a paragraph doesn’t have a specific
end token, but is delimited only by formatting. :)

This is true, but only to a degree. If one is using block format
(modified or otherwise), then it is more proper to say that a
paragraph has no beginning token, but it has a clear end token
(typically a bit of extra space between paragraphs, along with a
ragged edge final line). If one is using “traditional” format, then
it is proper to say that a paragraph doesn’t have an ending mark,
but does have a beginning mark (the indentation of the first line).

However, what Larry was referring to is that in good analytical
writing, one writes paragraphs such that there is a beginning to the
argument, the argument itself, and a close to the argument addressed
by the paragraph. I personally think that Ruby’s approach (which is
mildly reminiscent of Pascal and Ada on this matter) is cleaner than
either Python’s (which is reminiscent of Fortran and COBOL) or
Perl’s (which is also that of C/C++/Java/etc.) in that the block is
cleanly delineated by meaningful tokens instead of symbols or
spaces.

def fun(foo)

end

to me seems to be much cleaner than

fun(foo):

or

sub fun(foo)
{

}

even though they ultimately mean the same thing. From a computer’s
perspective, it’s far easier to track scope with single-character
symbols (and it helps with editor paren-matching, too); IMO, from a
human’s perspective it’s far easier to track scope with meaningful
tokens than merely by space (especially since a single space error
can cause compile problems). Of course, judicious use of formatting
can make all of these things easier for the human to read.

-austin
– Austin Ziegler, austin@halostatue.ca on 2002.09.08 at 00.34.23

···

On Sun, 8 Sep 2002 12:50:25 +0900, Hal E. Fulton wrote:


···

----- Original Message -----
From: “Austin Ziegler” austin@halostatue.ca
To: “ruby-talk ML” ruby-talk@ruby-lang.org
Sent: Saturday, September 07, 2002 7:40 PM
Subject: Re: Larry Wall’s comments on Ruby

(as Larry said, a paragraph should
have a beginning, a middle, and an end – so should blocks).

An interesting comment, since a paragraph doesn’t
have a specific end token, but is delimited only
by formatting. :)

Oho, untrue :) There is the full stop (or question mark, exclamation
mark, or ellipsis…). Sure enough, the paragraph has the same delimiter(s)
as a sentence; but then, so does Ruby (end).


Giuseppe “Oblomov” Bilotta

Axiom I of the Giuseppe Bilotta
theory of IT:
Anything is better than MS

“Hal E. Fulton” wrote

My take on this is that we do things like sin(x)
rather than x.sin out of sheer mathematical habit,
not based on any compelling logic.

It is just that the deliberate “non-OO-ness” of sin(x) etc. makes these methods harder to use, and in particular reuse, when basically everything else follows the OO pragmatism - e.g. complex.rb essentially breaks the solution in [ruby-talk:37162] for being able to write “x.sin”.

/Christoph

Python was always OO insofar as its implementation of OO can be
called OO.

Would that this were so. However, it’s not; and no amount of claiming
that it was will change history. In the words of the Python FAQ:

6.7. Why must ‘self’ be declared and used explicitly in method
definitions and calls?

…When classes were added to Python, this was (again) the simplest
way of implementing methods without too many changes to the
interpreter.

[The FAQ then proceeds to spend a couple paragraphs attempting to
rationalize why this is a feature, not a bug]

Python’s OO is bolted on, and it shows. It’s true that it was bolted
on at a much younger stage than Perl’s, long before it came to massive
popularity. But as the Python FAQ admits, the reason you have to hold
the compiler’s hand in Python by explicitly declaring the ‘self’
variable was not a conscious design choice, as some have later
come to claim. It is that way because that was the easiest way to
kludge OO support in, and no one ever got around to fixing it. This is
also the reason that Python’s types and classes were not unified until
very recently, and why old Python code uses the procedural
‘String.foo(mystring)’ functions rather than the more OO
‘mystring.foo()’ methods.

The strongest argument for this design is that since Python doesn’t
have variable declarations (like Ruby), you need a way to tell the
compiler you are creating an instance variable rather than a local
variable. A closely related argument is that it’s an aid to the
reader to have instance variables clearly marked wherever they are
used. Ruby neatly solves this with the ‘@foo’ notation. Python,
eschewing “funny characters”, could still have solved this by making
‘self’ a keyword and requiring all uses of instance variables use the
‘self.foo’ notation.
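A minimal sketch of the ‘@foo’ point: the sigil alone tells both the reader and the parser which names are instance variables, so no explicit ‘self’ parameter is needed (the Counter class is invented for illustration):

```ruby
class Counter
  def initialize
    @count = 0     # the @ sigil marks an instance variable
  end

  def increment
    step = 1       # a plain name is a local variable
    @count += step
  end

  def count
    @count
  end
end

c = Counter.new
c.increment
c.count  # => 1
```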

Don’t tell any of this to a loyal Python user though, if you value
your hide. One of the reasons I prefer both Perl and Ruby over Python
for most tasks is just this - the community. Perl users - and Larry
himself - are perfectly happy to admit that Perl’s OO is a bolted-on
hack (albeit a clever one). And Ruby was OO from inception. On the
other hand the Python crowd would, in my experience, prefer to explain
to you at length why it’s your definition of OO that’s flawed, rather
than the language.

~Avdi

I completely agree with Hal. OO, no matter how wonderful it is and how
much it has helped us, is not the solution or model of
everything. Mathematics, for whatever reason, has adopted the function
model and not the object model. I personally prefer “abs(x)” to “x.abs”
when I am dealing with math, but “5.times” is fine, as it is dealing with
a programming construct.

Regards,

Bill

···

============================================================================
Hal E. Fulton hal9000@hypermetrics.com wrote:

My take on this is that we do things like sin(x)
rather than x.sin out of sheer mathematical habit,
not based on any compelling logic.

Hal

Albert Wagner wrote:

precludes the principles from applying to a complex program. I suppose that
there is either another “complexity theory” or the one that you pointed me to
has been bastardized in the pop science community.

I am a (discrete) mathematician. My brother happens to be a
professor of complexity theory at Rutgers University.
The complexity theory I pointed to has its origins in the 1970s.
It started with the recognition of the clear distinction between the NP
and exponential complexity classes. It is a very serious and highly
fruitful branch of computer science. One could say that this is
where the main focus of computer science lies. There is nothing
“bastardized” about it.

My understanding was that
OOP technologies grew out of some sort of studies in complexity, and were an
attempt to deal with that complexity.

I don’t know of any reason for OO techniques other than human psychology.

From a mathematical point of view there is no principal difference
between Ruby, a Turing machine, and Malbolge. (The difference is only
user-friendliness.)

Somewhat related: I saw a graph some years ago that made an impression on me.
The y axis was randomness and the x axis was complexity. It was to illustrate
that (a) if randomness was sufficiently high then statistical methods were
useful, and (b) if randomness was very low and complexity was low then
standard mechanical methods were useful. What I found startling was the large
area in the middle and to the right that was not random enough for statistics
to be reliable and too complex for mechanical methods. Yet it was just in this
area, where no reliable tools currently existed, that most of the interesting
and important problems of our time lay.

Sounds like pseudoscience to me…

Regards, Christian

Yes, I agree…

I even rather like the way some languages “modify”
the end token: end def, end while, end if, etc.

This certainly prevents the inane comments to the same
effect that people put in, and even gives an additional
hint to the interpreter. (It would prevent the mistake
that I often make where I leave out an ‘end’ and the
parser isn’t able to catch it until 50 lines later.)

My understanding is that Ruby used to have these
(before I ever saw it), but they conflicted with the
“modifier” forms of if, until, unless, while, etc.
Can’t really have both, unless you want some heavy-duty
parser impact.
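The conflict is visible in a couple of lines: ‘if’ serves both as a block opener (closed by a bare ‘end’) and as a trailing modifier, so a hypothetical ‘end if’ closer would collide with the modifier form, as Hal notes:

```ruby
x = 5
result = []

# block form, closed by a plain 'end'
if x > 3
  result << "block form"
end

# modifier form: same keyword, no 'end' at all
result << "modifier form" if x > 3

result  # => ["block form", "modifier form"]
```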

One idea I’ve wondered about is making the parser “smart”
enough to (at least) speculate about where an error really
happened, based on indentation.

This would be a small tip of the hat to Python, I suppose.
If an ‘end’ was indented differently from its corresponding
beginning, a warning would be generated. But then it would
interfere with people’s styles, and there would always be
some people who would mix spaces and tabs in creative and
irritating ways and set the tabstops to some setting that
the interpreter could never hope to guess.

It would work for me, though. :) Maybe I should create a
Ruby version of lint (rint?).
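A toy sketch of what such a ‘rint’ check might look like (purely illustrative; a real tool would need an actual lexer, and this handles only a few block keywords and no modifier forms):

```ruby
# Warn when an 'end' sits at a different indent level than the
# keyword that opened its block.
def indent_warnings(source)
  warnings = []
  stack = []
  source.each_line.with_index(1) do |line, no|
    stripped = line.strip
    next if stripped.empty?
    indent = line[/\A */].size
    word = stripped.split.first
    if %w[def if while until class module].include?(word)
      stack.push(indent)
    elsif word == "end"
      opened = stack.pop
      if opened && opened != indent
        warnings << "line #{no}: 'end' at column #{indent}, opener at column #{opened}"
      end
    end
  end
  warnings
end
```

On cleanly indented code this reports nothing; a misplaced ‘end’ produces a warning pointing near the real mistake.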

Hal

···

----- Original Message -----
From: “Austin Ziegler” austin@halostatue.ca
To: “ruby-talk ML” ruby-talk@ruby-lang.org
Sent: Saturday, September 07, 2002 11:46 PM
Subject: Re: Larry Wall’s comments on Ruby

I personally think that Ruby’s approach (which is
mildly reminiscent of Pascal and Ada on this matter) is cleaner than
either Python’s (which is reminiscent of Fortran and COBOL) or
Perl’s (which is also that of C/C++/Java/etc.) in that the block is
cleanly delineated by meaningful tokens instead of symbols or
spaces.

def fun(foo)

end

to me seems to be much cleaner than

fun(foo):

or

sub fun(foo)
{

}

even though they ultimately mean the same thing. From a computer’s
perspective, it’s far easier to track scope with single-character
symbols (and it helps with editor paren-matching, too); IMO, from a
human’s perspective it’s far easier to track scope with meaningful
tokens than merely by space (especially since a single space error
can cause compile problems). Of course, judicious use of formatting
can make all of these things easier for the human to read.

(snip)

I agree with you to the extent that I’m able. (I know
little about Python.)

I wonder if we could paraphrase the above quote by
saying, “A horse is a flying animal insofar as its
implementation of flying can be called flying.” :)

Hal

···

----- Original Message -----
From: “Avdi B. Grimm” avdi@avdi.org
To: “ruby-talk ML” ruby-talk@ruby-lang.org
Sent: Monday, September 09, 2002 8:10 PM
Subject: Re: Larry Wall’s comments on Ruby

Python was always OO insofar as its implementation of OO can be
called OO.

Would that this were so. However, it’s not; and no amount of claiming
that it was will change history. In the words of the Python FAQ:

Austin Ziegler austin@halostatue.ca wrote in message news:20020908044555.JCUL3718.tomts17-srv.bellnexxia.net@hogwarts

def fun(foo)

end

to me seems to be much cleaner than

fun(foo):

or

sub fun(foo)
{

}

whereas the latter will come up again with blocks.
(Haven’t you ever seen this in Ruby? :

       }
     }
   }
 }

}
)

When learning Python I first had problems with indentation,
as different editors handle tabs and spaces differently.
At least I got my jEdit configured to use ‘spaces as tabs’
and found a vi option set useful for writing Python.

I found reading Python sources as easy as reading
formatted prose, without unnecessary end- or brace-chains
at the end.

BUT … :

even though they ultimately mean the same thing. From a computer’s
perspective, it’s far easier to track scope with single-character
symbols (and it helps with editor paren-matching, too); IMO, from a
human’s perspective it’s far easier to track scope with meaningful
tokens than merely by space (especially since a single space error
can cause compile problems).

… at least I detected that I want to do text formatting
and indentation how and when I need it.

This was when I had almost decided to use Python as my one-and-only
language and looked at different use cases.
Oho! I learned that the typical Unix shell one-liners are
very hard, if not impossible. (So I was attracted by ruby -e and
companion options.)
I learned that there does exist an embedded Python
(like PHP or ePerl), but it works only with some special syntax
add-ons and twists, because in this case you need to declare
where the end of a block is. (So I was attracted by eruby.)
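For comparison, Ruby's standard-library ERB (a relative of eruby) shows the point: because blocks close with an explicit ‘end’, control flow embeds in a template with no special syntax add-ons:

```ruby
require 'erb'

# Plain stdlib ERB: the each/end pair delimits the loop body inside
# the template, just as it would in ordinary Ruby code.
template = ERB.new(<<~TPL)
  <% [1, 2, 3].each do |n| %>item <%= n %>
  <% end %>
TPL
puts template.result(binding)
```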

My result: Ruby is the better choice for one-for-all.

And regarding teaching novices:
Seeing that today in Germany many teens take their first
steps in programming in school (and not in a
university context), it would be great to see them
get familiar very early with the basic principles, including
that a block has a beginning and an end, as in most
other languages they will encounter later.

(Marginal note: I would prefer using scripting languages to
teach ’em programming, to have them learn basic subjects
from the first ‘if’ up to the use of TCP sockets, instead
of only teaching a ‘language’ without principles or concepts.
Unfortunately, in our schools the CS lessons are given by
mathematics teachers who learned to write Java, and they
only teach their pupils to write Java.)

Bye

Dirk Detering -aka Det

“William Djaja Tjokroaminata” wrote in

I completely agree with Hal. OO, no matter how wonderful it is and how
much it has helped us, is not the solution or model of
everything. Mathematics, for whatever reason, has adopted the function

If my memory serves me right, the author of an introductory
set theory book (which I went through longer ago than
I wish to remember ;-) argued that a notation like
“(x)f” or “(x)(f * g)” probably would have been a better fit with the
standard left-to-right Latin writing system - in other words, by adopting
“5.cos” Ruby would finally correct a historical mistake of epic
proportions ;-)

model and not the object model. I personally prefer “abs(x)” to “x.abs”
when I am dealing with math, but “5.times” is fine, as it is dealing with

Actually I don’t really mind so much writing abs(x) or sin(x),
but I just cannot get over the “Math” prefixes: Math.cos(x),
Math.sin(x) … they are reaaally tacky.

In the absence of namespaces (which are a much better
solution than the proposed Numeric mixins) things would
not be that bad if Math were at least a class (with disabled
new/allocate) - at least I could write

class MyMathNameSpace < Math; end

class << MyMathNameSpace
  def sin_squared(x)
    sin(x) * sin(x)
  end
end

/Christoph
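As an aside, since Math is in fact a module rather than a class, something close to the wished-for subclassing already works today via mixin (a small sketch; the class name is invented):

```ruby
class MyCalc
  include Math  # Math is a module, so it can be mixed in directly

  def sin_squared(x)
    sin(x) * sin(x)  # bare sin() now resolves through the mixin
  end
end

MyCalc.new.sin_squared(0.0)  # => 0.0
```

This gives no true namespacing, but it does remove the Math. prefix inside the class.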

Christian Szegedy wrote:

Albert Wagner wrote:

precludes the principles from applying to a complex program. I
suppose that there is either another “complexity theory” or the one
that you pointed me to has been bastardized in the pop science community.

I am a (discrete) mathematician. My brother happens to be a
professor of complexity theory at Rutgers University.
The complexity theory I pointed to has its origins in the 1970s.
It started with the recognition of the clear distinction between the NP
and exponential complexity classes. It is a very serious and highly
fruitful branch of computer science. One could say that this is
where the main focus of computer science lies. There is nothing
“bastardized” about it.

Sorry, I misinterpreted your sentence about
“bastardized in the pop science…”.
Of course, you must be right…

Just to give an impression of how important computational
complexity theory is: its main problem (P != NP) leads
the list of the seven most prestigious and important unsolved
mathematical problems. If you prove it, you get 1 million
bucks. In fact this conjecture is very fundamental: almost
all results in computational complexity theory are based on it,
yet it is still unproven. I don’t know of any other part of
mathematics which is based on an unproven conjecture.

http://www.claymath.org/prizeproblems/

Best regards, Christian

Really? Turing machines don’t model interaction with their environment,
but Ruby does (I don’t know about the Malbolge case :). For example, how
can I implement an HTTP server (or echo server) on a TM?

IMHO, mathematics which deals with interaction looks quite different
from number theory or recursion theory.

sorry for off topic,

– Gotoken

···

At Sun, 8 Sep 2002 04:31:06 +0900, Christian Szegedy wrote:

I don’t know of any reason for OO techniques other than human psychology.

From a mathematical point of view there is no principal difference
between Ruby, a Turing machine, and Malbolge. (The difference is only
user-friendliness.)