Design Philosophies (was Re: Different Behavior of Class Variables w.r.t ||=)

This is an interesting question, analogous to the forced use of white space
in Python or the absence of pointers in Java. The philosophy seems to be
that it’s better to force certain behavior for a developer’s own good.

If you haven’t been using the -w switch all along, setting RUBYOPT=-w is
akin to pulling back your couch after not cleaning for several months. You
find all sorts of icky things. And, as with your couch, you have to wonder
how much of the ick is actually desirable.
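A minimal sketch of the sort of ick that turns up (the script and names are
made up; none of it warns unless -w is on):

```ruby
# couch.rb -- hypothetical example; run as `ruby -w couch.rb`
# (or with RUBYOPT=-w set) to see the warnings noted below.

leftovers = "pizza"   # warns: assigned but unused variable - leftovers

def vacuum
  "clean"
end

def vacuum            # warns: method redefined; discarding old vacuum
  "spotless"
end

puts vacuum           # prints "spotless" either way; warnings go to stderr
```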

When you see a list of warnings for a given script, how often do you say,
“Yeah, I know, I meant that”? Do people here have programmatic idioms
that, while producing warnings, are justified?

I’m not fond of restricting programmer choice; there should be enough rope
to shoot oneself in the foot. But so long as warnings can easily be turned
off, what are the arguments for not forcing people to look under the couch
all the time? What do you gain by avoiding warnings by default?

James

···

-----Original Message-----
From: Gavin Sinclair [mailto:gsinclair@soyabean.com.au]
Sent: Friday, October 11, 2002 8:07 PM
To: ruby-talk ML
Subject: Re: Different Behavior of Class Variables w.r.t ||=

From: "James Britt" james@jamesbritt.com

I think Ruby should have them on by default, and use a
command-line switch to suppress them.

Where do you put the switch if you’ve set a file as executable and run it
without specifying the interpreter?

Fair enough. In Unix, of course, that’s a non-issue. But there’s a way
around that as well…
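One such way around it, presumably, is putting the switch on the shebang
line itself (a sketch; the path and the portability caveats are noted in
the comments):

```ruby
#!/usr/bin/ruby -w
# The kernel hands the interpreter at most one option word, so this form
# works, while `#!/usr/bin/env ruby -w` typically fails on Linux ("ruby -w"
# reaches env as a single argument, unless your env supports -S).
# Under -w, $VERBOSE is true; without it, false.
puts "warnings on: #{$VERBOSE.inspect}"
```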

Why not just set your RUBYOPT value to include warnings?

…why not set RUBYOPT to suppress them.

My point is not for myself; I always use -w. Always :-) My point is for
other people.


This is an interesting question, analogous to the forced use of white space
in Python or the absence of pointers in Java. The philosophy seems to be
that it’s better to force certain behavior for a developer’s own good.

Except that we’re only talking about warnings here. They can be ignored.

If you haven’t been using the -w switch all along, setting RUBYOPT=-w is
akin to pulling back your couch after not cleaning for several months. You
find all sorts of icky things. And, as with your couch, you have to wonder
how much of the ick is actually desirable.

When you see a list of warnings for a given script, how often do you say,
“Yeah, I know, I meant that”? Do people here have programmatic idioms
that, while producing warnings, are justified?

My experience of C/C++ was that I couldn’t possibly write a working program
without taking notice of warnings. Thus I accepted the received wisdom (in
programming books) to treat warnings as errors. It’s less of an issue with
Ruby, but I think it’s a good habit.

So my “idiom” is to modify code so that it doesn’t produce warnings. Of course
I “know” that some warnings are harmless (method/constant redefined), but I
have no objection to changing the code.
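For instance, a redefined constant warns even without -w (say, when a file
gets loaded twice), and the no-warning rewrite is mechanical (a sketch;
the names are invented):

```ruby
# Reassigning an initialized constant always warns
# ("already initialized constant COUCH_STATE").
COUCH_STATE = "icky"

# Removing it first keeps things quiet; remove_const is private,
# hence the send.
Object.send(:remove_const, :COUCH_STATE)
COUCH_STATE = "clean"

puts COUCH_STATE   # prints "clean", no warning
```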

I’m not fond of restricting programmer choice; there should be enough rope
to shoot oneself in the foot. But so long as warnings can easily be turned
off, what are the arguments for not forcing people to look under the couch
all the time? What do you gain by avoiding warnings by default?

You gain more questions to ruby-talk! :-)

James

Gavin
