if what you mean by ‘defined’ is that storage is reserved for a symbol, as
opposed to declaring a symbol, then isn’t the above exactly what you suggest?
eg.
@sum = @smu + 1
can be tracked down precisely because @smu has not been defined?
my interpretation of ‘define’ is something along the lines of:
- appears as an lvalue
- appears in a ‘def’ statement
- appears as a symbol in attr :symbol
- etc.
in fact, i don’t think there is a way to declare, without defining, a symbol
in ruby? does anyone know otherwise?
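a quick runnable check of that claim (a minimal sketch, assuming a reasonably modern ruby; the Probe class name is made up for illustration) — an instance variable only comes into existence on assignment, and even assigning nil is enough to define it:

```ruby
class Probe
  def before
    defined?(@x)     # nil - @x has never been assigned, so it is undefined
  end

  def after
    @x = nil         # even assigning nil *defines* the variable
    defined?(@x)     # "instance-variable"
  end
end

p Probe.new.before   # => nil
p Probe.new.after    # => "instance-variable"
```

so there really is no declaration step separate from the first assignment to observe.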
that being the case, while trying to define @sum using
@sum = @smu + 1
the error can be tracked down precisely because @smu has not been defined.
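a sketch of how that typo surfaces at runtime (the Accumulator class name is made up for illustration): the unassigned @smu evaluates to nil, so the arithmetic blows up on exactly the line containing the misspelling:

```ruby
class Accumulator
  def bump
    @sum = @smu + 1   # typo: @smu was meant to be @sum
  end
end

begin
  Accumulator.new.bump
rescue NoMethodError => e
  # the exact wording varies by ruby version, but it points at nil:
  # e.g. "undefined method '+' for nil"
  puts e.message
end
```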
personally i would not like to see mandatory declarations as part of ruby,
though optional ones might at times be useful:
eg. a wanna-be declaration:

def foo; end
puts foo
def foo
  'foo'
end
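the stub-first pattern is runnable today: the empty def acts as a forward declaration, so the call site works even though the real body arrives later (a sketch; without -w ruby redefines the method silently):

```ruby
def foo; end          # stub "declaration" - callable, returns nil

puts foo.inspect      # nil - the stub is already callable here

def foo               # real definition replaces the stub
  'foo'
end

puts foo              # foo
```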
i’m no language designer by a long shot, but i think this might slow the
interpreter down if this worked by default? if not, i think it would be a
nice feature… right now the only workaround (to maintain readability) is
to put foo into a module and do
require 'foo'
puts foo
which is o.k. most of the time, but sometimes a real optional ‘declaration’ would be
nice too
perhaps a new block method like
declare {
  class Forward
    def method oneArg, twoArgs, threeArgs, four
    end
  end
}
could force two passes of the interpreter?
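the declare block itself is not real ruby, but open classes give a rough stand-in for it today (a sketch; Forward, build, and greet are made-up names): an empty class body "declares" the constant, and the real methods are filled in by reopening the class later, before anything actually calls them:

```ruby
class Forward; end            # forward "declaration" - just the constant

def build
  Forward.new                 # can reference Forward before its methods exist
end

class Forward                 # reopened: the actual definition
  def greet(name)
    "hello, #{name}"
  end
end

puts build.greet('matz')      # hello, matz
```

this works because method lookup happens at call time, not at parse time - which is also why ruby can get away with a single pass in the first place.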
i’m guessing matz has good reason for having ruby be a one-pass interpreter though…
-a
On Fri, 18 Oct 2002, Kontra, Gergely wrote:
> The current behavior is a useful way to catch programming errors such
> as
>
> @sum = @smu + 1
>
> This can be tracked down if all fields must be explicitly defined.
–
====================================
Ara Howard
NOAA Forecast Systems Laboratory
Information and Technology Services
Data Systems Group
R/FST 325 Broadway
Boulder, CO 80305-3328
Email: ahoward@fsl.noaa.gov
Phone: 303-497-7238
Fax: 303-497-7259
====================================