Larry Wall's comments on Ruby

We may not agree with what Larry has to say about Ruby, but as usual he
says it well.

The relevant part:

"As for specifics, I must say that the example of Ruby is the main reason
I decided against implicit lexical scoping for Perl 6. We’ll be sticking
with explicit my declarations. But I have to like the majority of Ruby
simply because that’s the part that was borrowed straight out of Perl. :-)

I also liked Ruby’s unary splat operator, so I borrowed it for Perl 6.

The main problem I see with Ruby is that the Principle of Least Surprise
can lead you astray, as it did with implicit lexical scoping. The question
is, whose surprise are you pessimizing? Experts are surprised by different
things than beginners. People who are trying to grow small programs into
large programs are surprised by different things than people who design
their programs large to begin with.

For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object. To a beginner, a number is
just a number. A string is a string. They may well be objects as far as
the computer is concerned, and it’s even fine for experts to treat them as
objects. But premature OO is a speed bump in the novice’s onramp."
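
For anyone who hasn’t met the splat Larry mentions, a rough sketch (toy names, mine not Larry’s):

def add(a, b, c)
  a + b + c
end

args = [1, 2, 3]
add(*args)           # unary splat expands the array into three arguments => 6

def gather(*items)   # in a signature, splat collects the arguments into an array
  items
end
gather(1, 2, 3)      # => [1, 2, 3]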

Personally, I like Ruby’s scoping rules a lot better than Perl’s.

Also, I think that everything being an object is actually helpful for
beginners: it’s much easier to pick up OO programming ideas if you learn
them first, I think. It also tends to make things much more consistent.

Thoughts?

Phil

Hi –

Larry Wall wrote:

The main problem I see with Ruby is that the Principle of Least Surprise
can lead you astray, as it did with implicit lexical scoping. The question
is, whose surprise are you pessimizing?

No question here. We have the answer: Matz’s.

David

···

On Sat, 7 Sep 2002, Phil Tomson wrote:


David Alan Black | Register for RubyConf 2002!
home: dblack@candle.superlink.net | November 1-3
work: blackdav@shu.edu | Seattle, WA, USA
Web: http://pirate.shu.edu/~blackdav | http://www.rubyconf.com

The main problem I see with Ruby is that the Principle of Least Surprise
can lead you astray, as it did with implicit lexical scoping. The question
is, whose surprise are you pessimizing? Experts are surprised by different
things than beginners. People who are trying to grow small programs into
large programs are surprised by different things than people who design
their programs large to begin with.

This may have to do with the granularity of surprise. For example, if I use
an object in a string context (e.g., puts my_obj), I would be surprised if
there wasn’t an implicit call to to_s or to_str. And I believe that is how
Perl does it; the parser “knows” the context, be it scalar, or vector, or
whatever, and does The Right Thing.
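
That is roughly what Ruby does too; a minimal sketch (Widget is a made-up class):

class Widget
  def to_s
    "widget #42"
  end
end

my_obj = Widget.new
puts my_obj            # puts calls to_s implicitly: prints "widget #42"
puts "got: #{my_obj}"  # string interpolation calls to_s as well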

On the other hand, I tried testing some blogging code on a different box,
and was “surprised” it failed. (Turns out the code needs to know if it is
run under CGI, or mod_ruby, or command line.) But I’d be really surprised
if anyone thought this point was the responsibility of anyone but the
developer.

I’d be interested to know if anyone was surprised by something in Ruby when
they moved a small app to a large app.

The POLS is sort of an 80/20 thing. Certain things (like some_string[0]
returning a number, not a character) surprised me, but by and large the
least surprise grows from a consistency of design principles, not from
Matz’s mood on any given day.
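
For the curious, the surprise looks something like this (byte-value indexing is the behavior of the Ruby of this era; later versions return a one-character string instead):

s = "abc"
s[0]      # Ruby 1.6/1.8: 97, the byte value of "a"
s[0, 1]   # => "a" in any version: the one-character substring
s[0].chr  # 1.8 idiom to get the character back: 97.chr => "a"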

For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object. To a beginner, a number is
just a number. A string is a string. They may well be objects as far as
the computer is concerned, and it’s even fine for experts to treat them as
objects. But premature OO is a speed bump in the novice’s onramp."

Personally, I like Ruby’s scoping rules a lot better than Perl’s.

Also, I think that everything being an object is actually helpful for
beginners: it’s much easier to pick up OO programming ideas if you learn
them first, I think. It also tends to make things much more consistent.

Thoughts?

I, too, think it makes more sense to present objects first, then show
concrete examples later. Because, later on, if you have various exceptions
to Everything Is An Object (see Java), you have to mentally track more
stuff. So, rather than some possible brief discomfort when learning the
language, you have mild discomfort all the time (see Java).
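
The uniformity is easy to see in a few lines of Ruby:

42.class            # => Integer (Fixnum, in the Ruby of this era)
"abc".class         # => String
nil.class           # => NilClass
42.respond_to?(:+)  # => true: even arithmetic is a method call
-5.abs              # => 5: no primitive/object split to memorize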

I don’t think Larry is giving OO newcomers enough credit.

James

···

Phil

A total beginner, I think, would have the same amount of difficulty
learning OO vs. non-OO languages. For a beginner to ruby, but not to
programming, I would think that the learning curve would be strongly
background dependent.

Having recently dived into a perl project for a client after a short
vacation from perl, I’m finding that having everything OO in ruby
makes it much easier to look up functions and operators in code and
documentation. For rarely used perl functions, I find that I have to
go poking around in the docs about twice as long to locate the
reference info I want.
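
One nice side effect: a single reflection idiom covers the whole language. Something like:

42.respond_to?(:abs)        # => true
"abc".respond_to?(:length)  # => true
String.instance_methods(false).sort.first(5)  # browse a class's own methods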

···

On Sat, Sep 07, 2002 at 02:31:49AM +0900, Phil Tomson wrote:

Larry Wall On Perl, Religion, and... - Slashdot

"For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object. To a beginner, a number is
just a number. A string is a string. They may well be objects as far as
the computer is concerned, and it’s even fine for experts to treat them as
objects. But premature OO is a speed bump in the novice’s onramp."

Personally, I like Ruby’s scoping rules a lot better than Perl’s.

Also, I think that everything being an object is actually helpful for
beginners: it’s much easier to pick up OO programming ideas if you learn
them first, I think. It also tends to make things much more consistent.


Alan Chen
Digikata LLC
http://digikata.com

The main problem I see with Ruby is that the Principle of Least Surprise
can lead you astray, as it did with implicit lexical scoping. The question
is, whose surprise are you pessimizing? Experts are surprised by different
things than beginners. People who are trying to grow small programs into
large programs are surprised by different things than people who design
their programs large to begin with.

I think this whole Least Surprise Principle is a load of bullshit that
is invoked far too often for no good reason. It has a fancy name, but I
translate it to myself as “when matz made Ruby he made sure the way it
worked made sense to him”. Excuse me, isn’t that how all languages are (or
should be) made?

When you are a complete novice to computers, nothing in Ruby (or in any
other language) will be familiar to you and you will be ‘surprised’ by
everything. You just go ahead and learn how Ruby (or any other language)
works and you live with it and let Ruby be Ruby. Then you gradually
train your intuition to Ruby and everything is dandy.

When you learn Ruby after having a lot of experience with languages such
as C++ or Java, everything makes sense, and the Principle seemingly works.
If you had only programmed in Fortran before (which is the case with a lot
of old-school physicists, for example) then you will probably be even
more surprised than if you were a novice…

So face it, Ruby is just a language with its own structure, logic,
syntax, and attitude. You just learn it, like every other language. It
is a good language simply because it was written by a good programmer;
and the story about the Principle of the Least Surprise is a good
anecdote for language historians but has nothing to do with any serious
discussion on advantages and disadvantages of Ruby.

He, he, I think Larry Wall really got himself into trouble this time. The
first time I learned Ruby, I didn’t use OO features at all; I just did
“straight” procedural programming in a script:

a = ...
b = ...
c = a * b
def func(x)
  ...
end

Then finally, after I learned all the OO stuff, I got the quite pleasant
surprise that I had actually been programming in OO from the beginning,
as the simple script above actually lives inside class Object. What can be
better than this? Even in Java people are choked with endless
keywords and object boilerplate from the beginning. I would not hesitate to
say that Matz is much more of a genius than Larry!
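
You can check this for yourself; a short sketch (func is just a stand-in name):

puts self.class   # => Object: top-level code runs inside an instance of Object

def func(x)
  x * 2
end

# top-level defs become private instance methods of Object:
Object.private_method_defined?(:func)   # => true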

Regarding the keyword “my”, my philosophy is always that less typing is
better (unless we are paid by the hour :-) ). To me, the use of “@” for
instance variables and nothing for local variables is one of the best, if
not the best, ways of designing a language. As I already wrote before,
there is no comparison between Perl and Ruby, well, … except probably for
CPAN…
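
In code, the whole declaration story is carried by the sigil; a minimal sketch (Counter is a made-up class):

class Counter
  def initialize
    @count = 0   # @ marks an instance variable; no declaration keyword needed
  end

  def bump
    step = 1     # a bare name is a local variable, no "my" required
    @count += step
  end

  def count
    @count
  end
end

c = Counter.new
c.bump
c.count   # => 1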

Regards,

Bill

···

===========================================================================
Phil Tomson ptkwt@shell1.aracnet.com wrote:

Larry Wall:

For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object. To a beginner, a number is
just a number. A string is a string. They may well be objects as far as
the computer is concerned, and it’s even fine for experts to treat them as
objects. But premature OO is a speed bump in the novice’s onramp."

Larry Wall On Perl, Religion, and... - Slashdot

We may not agree with what Larry has to say about Ruby, but as usual he
says it well.

The relevant part:

"As for specifics, I must say that the example of Ruby is the main reason
I decided against implicit lexical scoping for Perl 6. We’ll be sticking
with explicit my declarations. But I have to like the majority of Ruby
simply because that’s the part that was borrowed straight out of Perl. :-)

I’m suspicious of anything Larry, or any of the top Perl brass, have to
say regarding scoping rules. Variable declaration in Perl is a
nightmare: do you use “my”, “our”, “local”, or a filehandle? Be
careful! The scoping and namespace rules change in subtle and nefarious
ways depending on the declaration. It’s even worse if you plan on using
references (or, more accurately, one of the three different types of
references), since the scoping and namespace rules change in subtle and
nefarious ways depending on the declaration.

Examples of this madness? A “local” variable isn’t really local; it’s
valid in nested subroutines as well. A top-level “my” variable isn’t
really global to the package; you have to use “our” in order to allow
that behavior (this distinction is important if you’re using symbolic
references instead of scalar references).

Perl doesn’t have the best track record regarding sane scoping rules.
Oh, I’m sure they have it all figured out this time around, and they
promise to get it right in Perl 6. As Linus says, show me the code. If
I sound bitter, it’s because I’ve been burned by this mess more than
once. That said, Ruby does have its scoping quirks (e.g., eval and
closure/block ambiguity), most of which have been beaten to death on this
list, so I won’t bother regurgitating them here.
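
For newcomers, the best-known split is that blocks are closures while method
definitions are not; a minimal sketch:

x = 10
[1, 2, 3].each { |i| x += i }  # a block sees and mutates enclosing locals
puts x                         # => 16

def peek
  x  # this x is NOT the outer one; a def opens a fresh scope
end
peek rescue puts "NameError: methods don't close over locals"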

I also liked Ruby’s unary splat operator, so I borrowed it for Perl 6.

The main problem I see with Ruby is that the Principle of Least Surprise
can lead you astray, as it did with implicit lexical scoping. The question
is, whose surprise are you pessimizing? Experts are surprised by different
things than beginners. People who are trying to grow small programs into
large programs are surprised by different things than people who design
their programs large to begin with.

For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object. To a beginner, a number is
just a number. A string is a string. They may well be objects as far as

That’s simply not true. If a number is a number, then it shares a set
of common characteristics with all other numbers. Is it zero? Is it an
integer? What integer does it round to? Even non-technically inclined
people think this way; sets of things are grouped together because they
share attributes. As Andy says in another response, there’s nothing
counterintuitive or complicated about making this natural thought
process a fundamental aspect of a language. On the contrary, the
unnatural distinction between objects (thingies which possess attributes)
and primitives (thingies which are magically exempt or devoid of
attributes) is counterintuitive and confusing for beginners. Larry
should know better.
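
Ruby makes those very questions one-liners; for example:

n = 3.7
n.zero?           # => false: "is it zero?"
n.integer?        # => false: "is it an integer?"
n.round           # => 4:     "what integer does it round to?"
n.round.integer?  # => true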

the computer is concerned, and it’s even fine for experts to treat them as
objects. But premature OO is a speed bump in the novice’s onramp."

Personally, I like Ruby’s scoping rules a lot better than Perl’s.

Obviously I agree. :-)

Also, I think that everything being an object is actually helpful for
beginners: it’s much easier to pick up OO programming ideas if you learn
them first, I think. It also tends to make things much more consistent.

Again, I agree.

···

Thoughts?

Phil


Paul Duncan pabs@pablotron.org pabs in #gah (OPN IRC)
http://www.pablotron.org/ OpenPGP Key ID: 0x82C29562

Larry Wall wrote:
[ snippage ]

For instance, I think it’s a violation of the Beginner’s
Principle of Least Surprise to make everything an object. To a
beginner, a number is just a number. A string is a string. They
may well be objects as far as the computer is concerned, and
it’s even fine for experts to treat them as objects. But
premature OO is a speed bump in the novice’s onramp.

Phil Tomson graced us by uttering:

Personally, I like Ruby’s scoping rules a lot better than
Perl’s.

Thoughts?

Also, I think that everything being an object is actually
helpful for beginners: it’s much easier to pick up OO
programming ideas if you learn them first, I think. It also
tends to make things much more consistent.

After nearly two decades of coding (yes, some of you have been
doing it longer), I no longer claim to be able to twist my mind
into the “Beginner” shape. In fact, I can’t say I honestly give
a single, solitary thought to how a beginner might perceive a
given language, syntax, or construct. My primary concerns are:

  • Does it do what I need?
  • Does it do too much more than what I need?
  • Will my peers be able to understand what I’m doing?

With these criteria, both Ruby and Perl fit my bill, so long as
the “peers” reviewing the Perl code know Perl, and the “peers”
reviewing Ruby code know Ruby.

Most non-coders have trouble distinguishing between a “word” (in
the linguistic sense) and “character-based representation of a
word”, much less Functional vs. Object-Oriented programming.

So the question for Larry (IMHO) becomes: “As a language
designer, how far back in the evolution of a programmer should I
cater?” But I’m not Larry, regardless of whether I agree with
his decision. :-)

Today I heard a user attempt to explain a problem.

  "I was typing my project and MS Word told me I'd performed an
illegal operation.  I clicked OK and the window went away and
when I opened up Word again, it wouldn't let me edit the file
or delete it.  I had to start all over."

So MS Word had (predictably) crashed, leaving a swapfile and
zombie filehandle in its wake. After (predictably) restarting
Windows, the forsaken victim of a file entry could safely be
removed. Of course these aren’t the users Larry has in mind, but
where does he draw the line, I wonder?

Do Larry’s refinements on behalf of the beginner help me?
Seldom. Do they hinder me? Seldom. Am I glad there’s a
clear-headed developer behind such a prolific and valuable
language? Always.

Perl’s an exceptionally powerful functional language wearing an
ugly, dirty, but passably OO mask.

Ruby is a clean, powerful, and elegant OO language. It’s not
perfect, and it’s not the answer to everything… but we haven’t
hit v2.x yet!

Tim Hammerquist

···


Although the Perl Slogan is There’s More Than One Way to Do It, I hesitate
to make 10 ways to do something. :-)
– Larry Wall in 9695@jpl-devvax.JPL.NASA.GOV

Larry Wall wrote:

For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object.

Guys, Google Groups sez this thread does not yet contain the phrase
“sour grapes”. Can anyone see a way to fit it in?

···


Phlip
greencheese.org
– Proud victim of the dreaded boomerang effect –

So learning to deal with numbers AND strings AND arrays AND hashes AND
objects is easier than learning to deal with objects? Cool.

Massimiliano

···

On Sat, Sep 07, 2002 at 02:31:49AM +0900, Phil Tomson wrote:

For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object. To a beginner, a number is
just a number. A string is a string. They may well be objects as far as
the computer is concerned, and it’s even fine for experts to treat them as
objects. But premature OO is a speed bump in the novice’s onramp."

Erm… only my and our are declarations. local is not. And you’d use
a filehandle presumably when you wanted to read or write from a file
of some sort.

There are a few operations in perl that only act on global variables,
rather than lexical ones (notably local, symbolic refs, and formats).
They’re Perl’s appendix: vestigial remnants of an older time. Every
language has them, and they’re generally marked as deprecated.

I’d hardly call anything in perl nefarious, though. Well, with
perhaps the exception of the source to the regex engine, but all
regex engine code is evil.

···

At 6:00 AM +0900 9/7/02, Paul Duncan wrote:

I’m suspicious of anything Larry, or any of the top Perl brass, have to
say regarding scoping rules. Variable declaration in Perl is a
nightmare: do you use “my”, “our”, “local”, or a filehandle?


Dan

--------------------------------------“it’s like this”-------------------
Dan Sugalski                           even samurai
dan@sidhe.org                          have teddy bears and even
                                       teddy bears get drunk

Larry Wall wrote:

For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object.

Guys, Google Groups sez this thread does not yet contain the phrase
“sour grapes”. Can anyone see a way to fit it in?

Done.

James

···


Phlip
greencheese.org
– Proud victim of the dreaded boomerang effect –

Examples of this madness?

I agree with you, there is madness in "my" :-)

pigeon% perl
use strict;
{
    my $m;
    sub aa { eval '$m += 2' }
    sub bb { eval 'print "bb : $m\n"' }
    sub cc { $m += 3 }
    sub dd { print "dd : $m\n" }
}
for (0 .. 3) {
    aa(); bb();
    cc(); dd();
}
^D
bb : 2
dd : 3
bb : 4
dd : 6
bb : 6
dd : 9
bb : 8
dd : 12
pigeon%
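
For contrast, the analogous Ruby (my sketch, not Guy’s) has no such split:
every closure shares the one local, eval or not:

m = 0
add2 = lambda { m += 2 }            # both lambdas capture the same local m
show = lambda { puts "m : #{m}" }

4.times do
  add2.call
  show.call
end
# prints m : 2, m : 4, m : 6, m : 8 -- one variable, no compile-time/eval split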

Guy Decoux

Denys Usynin wrote:

I think this whole Least Surprise Principle is a load of bullshit that
is invoked far too often for no good reason. It has a fancy name, but I
translate it to myself as “when matz made Ruby he made sure the way it
worked made sense to him”. Excuse me, isn’t that how all languages are (or
should be) made?

I was able to use Ruby quite well one hour after downloading it.
And I used most features without reading any docs at all,
just by guessing based on common-sense considerations. (In all
other languages I know, there were always surprising details I had
to learn before using a particular feature.) This attribute of
Ruby became quite conscious to me before I had learned the phrase
"principle of least surprise". For me, it was not a buzzword but the
label for a concrete psychological experience I had. But, of course,
this is fairly subjective and other people may not have had the same
experience; that is no reason to tell us it does not exist, though.
I could demonstrate it with my brother also: I taught him Ruby for
half an hour, then I asked him "how would you do this or that more
conveniently?" He always named the best way according to him (sometimes
it was a feature he knew from Java, sometimes something he knew from
Maple, sometimes just a common-sense solution he thought was most
appropriate). Most worked quite well in Ruby :))

Excuse me, isn’t that how all languages are (or should be) made?

Yes, they should be, but are they?

It is a good language simply because it was written by a good programmer;

No, it is a good language because it was written by a
good psychologist and aesthete.

Programming skills are “only” a guarantee that it works, but do
not say anything about the quality of the language.

Regards, Christian

Alan Chen wrote:

Larry Wall said in “Larry Wall On Perl, Religion, and…” on Slashdot:

"For instance, I think it’s a violation of the Beginner’s Principle of
Least Surprise to make everything an object. To a beginner, a number is
just a number. A string is a string. They may well be objects as far as
the computer is concerned, and it’s even fine for experts to treat them as
objects. But premature OO is a speed bump in the novice’s onramp."

A total beginner, I think, would have the same amount of difficulty
learning OO vs. non-OO languages. For a beginner to ruby, but not to
programming, I would think that the learning curve would be strongly
background dependent.

Actually, I had a quite interesting experience last year.
My wife started to study computer science. She was a total
novice: she had never written a line of code in any programming
language. Interestingly, the base course in programming featured Haskell.
Sometimes she came to me to ask questions and I had real problems,
but she seemed to pick it up quite naturally.

The next course was about imperative programming in Pascal, and
I thought it would be really trivial for her after Haskell. It was
not the case: she always complained that Haskell was so logical,
but Pascal is a real mess. I had to admit that she was right.
So I think the preconception that imperative programming is simple
for novices, and functional or OO harder, is simply not true. I
think it is based on the fact that most people entering universities
or courses already have experience in some imperative language.

Regards, Christian

Hi,

I think this whole Least Surprise Principle is a load of bullshit that
is invoked far too often for no good reason. It has a fancy name, but I
translate it to myself as “when matz made Ruby he made sure the way it
worked made sense to him”. Excuse me, isn’t that how all languages are (or
should be) made?

Although I admit the “Principle of Least Surprise” has had a bigger impact
than I expected, I still disagree that all languages follow the
principle.

Language designers are often too near-sighted in designing their
languages. They often focus too much on “what it can do”. But in
reality we have to focus on “how it can solve problems” or “how it makes
you feel while programming”. I think this is the difference.

I also admit that Ruby offers a bunch of surprises when you meet it for
the first time. But once you become familiar with it, you will feel
comfortable programming in it.

Believe me, not all languages are made equal. Some (or many)
languages just don’t make you feel comfortable even after you’ve
mastered them. Their design just doesn’t care how you feel while you
program in them.

If “POLS” is not a proper word for the concept, and you have a better
slogan for it, I’d be glad to switch.

						matz.
···

In message “Re: Larry Wall’s comments on Ruby” on 02/09/07, Denys Usynin usynin@hep.upenn.edu writes:

POLS has enough meaning for enough people that it deserves to be considered
meaningful. The concept has been around for a long time, too.

Gavin

···

----- Original Message -----
From: “Denys Usynin” usynin@hep.upenn.edu

I think this whole Least Surprise Principle is a load of bullshit that
is invoked far too often for no good reason. It has a fancy name, but I
translate it to myself as “when matz made Ruby he made sure the way it
worked made sense to him”. Excuse me, isn’t that how all languages are (or
should be) made?

He, he, I think Larry Wall really got himself into trouble this time. The
first time I learned Ruby, I didn’t use OO features at all; I just did
“straight” procedural programming in a script:

a = ...
b = ...
c = a * b
def func(x)
  ...
end

That’s something that’s too often forgotten. Procedural is really a subset of
OO. You can write a (high-level) C program within Java. One student I know
did it as a protest: 500 lines inside main(), not even any functions. As far
as I know, he now works for Red Hat.

People often complain about OO for no good reason whatsoever. Larry must have
caught too much sun recently to be barking about OO in Ruby!

Then finally, after I learned all the OO stuff, I got the quite pleasant
surprise that I had actually been programming in OO from the beginning,
as the simple script above actually lives inside class Object. What can be
better than this? Even in Java people are choked with endless
keywords and object boilerplate from the beginning. I would not hesitate to
say that Matz is much more of a genius than Larry!

Whoa! Larry Wall is an extremely clever man. Perl is a major work of modern
art, as ugly as all the rest of modern art. This is potentially flamewar
territory (though this group usually survives those pretty well), but let me
just say politely that Ruby borrows a lot from Perl: both positive features
to extract and negatives to leave behind.

Regarding the keyword “my”, my philosophy is always that less typing is
better (unless we are paid by the hour :-) ). To me, the use of “@” for
instance variables and nothing for local variables is one of the best, if
not the best, ways of designing a language. As I already wrote before,
there is no comparison between Perl and Ruby, well, … except probably for
CPAN…

Right on. Dig it.

Regards,

Bill

–Gavin

···

----- Original Message -----
From: “William Djaja Tjokroaminata” billtj@y.glue.umd.edu

I’m suspicious of anything Larry, or any of the top Perl brass, have to
say regarding scoping rules. Variable declaration in Perl is a
nightmare: do you use “my”, “our”, “local”, or a filehandle?

Erm… only my and our are declarations. local is not. And you’d use
a filehandle presumably when you wanted to read or write from a file
of some sort.

True. I guess what I’m really trying to say is this: despite the
additional complexity and inconsistent behavior, Perl’s scoping rules
are not really that much more powerful than Ruby’s scoping rules.

The reason I tossed in file handles? I was using them to illustrate how
various types in Perl have scoping nuances that lead to unexpected
behavior and unexpected results. Case in point:

use strict;                               # no more implicit variables
open HeyImStillImplicit, "filename.txt";  # okay!!???
open $foo, "filename.txt";                # nope, sorry

this is okay:

my $foo;
open $foo, "filename.txt";

but this isn't:

my FILEHANDLE;
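
(For contrast, in Ruby a file handle is just an ordinary object in an ordinary
local variable; a rough sketch, assuming a filename.txt exists:)

f = File.open("filename.txt")   # no bareword handles, no declaration keywords
first_line = f.gets
f.close

# or let the block form close it automatically:
File.open("filename.txt") { |io| io.each_line { |line| puts line } }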

There are a few operations in perl that only act on global variables,
rather than lexical ones (notably local, symbolic refs, and formats).
They’re Perl’s appendix: vestigial remnants of an older time. Every
language has them, and they’re generally marked as deprecated.

Neither symbolic references nor the local keyword are marked as
deprecated. To be fair, local says “you should probably be using my()”.

···

At 6:00 AM +0900 9/7/02, Paul Duncan wrote:

I’d hardly call anything in perl nefarious, though. Well, with
perhaps the exception of the source to the regex engine, but all
regex engine code is evil.

                                    Dan

--------------------------------------“it’s like this”-------------------
Dan Sugalski                           even samurai
dan@sidhe.org                          have teddy bears and even
                                       teddy bears get drunk


Paul Duncan pabs@pablotron.org pabs in #gah (OPN IRC)
http://www.pablotron.org/ OpenPGP Key ID: 0x82C29562

See below:

%% -----Original Message-----
%% From: Christian Szegedy [mailto:szegedy@t-online.de]
%% Sent: Saturday, September 07, 2002 8:22 AM
%% To: ruby-talk ML
%% Subject: Re: Larry Wall’s comments on Ruby

%% Actually, I had a quite interesting experience last year.
%% My wife started to study computer science. She was a total
%% novice: she had never written a line of code in any programming
%% language. Interestingly, the base course in programming featured Haskell.
%% Sometimes she came to me to ask questions and I had real problems,
%% but she seemed to pick it up quite naturally.
%%
%% The next course was about imperative programming in Pascal, and
%% I thought it would be really trivial for her after Haskell. It was
%% not the case: she always complained that Haskell was so logical,
%% but Pascal is a real mess. I had to admit that she was right.
%% So I think the preconception that imperative programming is simple
%% for novices, and functional or OO harder, is simply not true. I
%% think it is based on the fact that most people entering universities
%% or courses already have experience in some imperative language.
%%
%% Regards, Christian

This makes total sense to me: Languages that are modeled after the way
people actually think are easier for people to grasp than languages that are
modeled after the way computers are designed to manipulate bits of data.
Pascal, C, C++, and especially assembler are, for this reason, quite
counter-intuitive to someone not familiar with the way computers process
information (and the nuances in the ways different chip architectures are
designed).

At bottom, high-level languages must be implemented in lower-level
languages, or the computer would have no idea how to interpret what the
human is asking it to do. But in terms of one’s ability to “get” the syntax
of a computer language, the syntax must be highly abstracted from the
language of the machine.

The pluses:
+ easy to learn
+ easy to use
+ easier to focus on the problem domain rather than on the mechanics of
information processing, thus
+ easier to get work done quickly

The minuses:
- can severely limit one’s view of how a computer actually works, thus
trapping one in a single paradigm or computational model, in much the same
way (as the saying goes) everything looks like a nail to someone who has
only a hammer in his/her toolbox. OO is in this respect sort of like the
ultimate Swiss Army knife, but it is still just one approach of many that
might work, and work well, in a given problem domain.
- if one’s favorite high-level language were not available, and only a
lower-level language were, one would be SOL unless one knew how to
program in that lower-level language. Everyone should know at least one
low-level language well enough to get by in a crunch.

Generally speaking, the pluses of high-level languages like Ruby outweigh
the minuses for most applications, since availability of high-level
languages is not a serious issue in light of the open source software
movement, the Internet, and the decreasing cost of hardware.

Sincerely,

Bob Calco