Undefine

For the sake of it (fixed this time, I hope):

def test_1()
  p defined? x              # => nil: no assignment to x has been parsed yet
  x = "hello" if defined? x # the assignment is parsed before the modifier
                            # condition, so x is already known when the
                            # condition runs, and the assignment executes
  p x, defined? x           # => "hello", then "local-variable"
end
test_1() # => nil, "hello", "local-variable"

def test_2()
  p defined? x          # => nil: no assignment to x has been parsed yet
  if defined? x then    # never true: parsed before any assignment to x
    x = "hello"         # never executes...
  elsif defined? x then # ...but was parsed, so x is known here: true
    x = "world"
  end
  p x, defined? x       # => "world", then "local-variable"
end
test_2() # => nil, "world", "local-variable"

According to the definition of "defined" implemented by
defined?(), it appears that a local variable is in fact
not defined until the statement that assigns it gets its
chance to be executed. That happens at runtime.

However, at parse time (or eval time), the compiler determines
which statement will potentially introduce a local variable,
if it is given a chance to be executed.
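
For instance, a minimal sketch (the method name is mine): once the
parser has seen an assignment, the variable exists from that point
on, even if the assignment itself never executes:

def parse_demo()
  p defined? x   # => nil: the assignment below is not parsed yet
  x = 1 if false # the statement gets its chance, but never assigns
  p defined? x   # => "local-variable": x was introduced anyway
  p x            # => nil: the variable exists but holds no value
end
parse_demo()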

This is a rather contrived definition, and I wonder why
local variables are not simply defined at method level
(and begin/end level). There should be enough information
at parse time to do that, I guess. Ruby's scoping rule
is, to me, an unusual, surprising mix of static and
dynamic scoping. Other languages I know do static xor
dynamic scoping in similar cases. A purely static
scoping rule is usually more efficient because it makes
it possible to compute, at compile time, the number of
"slots" that need to be allocated, once only, at method
invocation time.
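
To make the difference concrete, a small sketch (the method name is
mine, and the "method-level" results are hypothetical; the "today"
results are what Ruby actually prints):

def slots()
  p defined? x # today: nil; under a method-level rule: "local-variable"
  x = 1
  p defined? x # => "local-variable" under both rules
end
slots()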

This is a very minor thing, and I would be surprised
to see an example of code taking advantage of this
unusual (to me) behavior of defined?().

Thank you to those who flagged mistakes in my previous
post.

Yours,

JeanHuguesRobert

···

On Jun 15, 2004, at 10:19 AM, tony summerfelt wrote:
I think the whole point here is that it isn't Ruby-ish to rely on a variable being defined or not. Ruby wasn't built to do things that way; a couple of examples in this thread show how variables are automagically defined when code is parsed. They aren't defined at runtime, they are defined at eval time.

-------------------------------------------------------------------------
Web: @jhr is virteal, virtually real
Phone: +33 (0) 4 92 27 74 17

I think the whole point here is that it isn't Ruby-ish to rely on a variable being defined or not. Ruby wasn't built to do things that way; a couple of examples in this thread show how variables are automagically defined when code is parsed. They aren't defined at runtime, they are defined at eval time.

For the sake of it (fixed this time, I hope):

<snip proof>

According to the definition of "defined" implemented by
defined?(), it appears that a local variable is in fact
not defined until the statement that assigns it gets its
chance to be executed. That happens at runtime.

Hmm. I *knew* that. I'm not sure what I was thinking when I wrote that out... a severe lapse of memory, I suppose.

Relatedly, it's always been a (minor) pet peeve of mine that code like this:

   print a while a = gets

doesn't work unless you have defined 'a' beforehand; you get a NameError otherwise.
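
The usual workarounds (a sketch): pre-declare the variable so the
parser already knows it, or use the full while ... end form, where
the assignment is parsed before the body:

   a = nil
   print a while a = gets

   while a = gets
     print a
   end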

However, at parse time (or eval time), the compiler determines
which statement will potentially introduce a local variable,
if it is given a chance to be executed.

which is what I confused it with. I really must sleep more often :)

This is a rather contrived definition, and I wonder why
local variables are not simply defined at method level
(and begin/end level). There should be enough information
at parse time to do that, I guess. Ruby's scoping rule
is, to me, an unusual, surprising mix of static and
dynamic scoping. Other languages I know do static xor
dynamic scoping in similar cases. A purely static
scoping rule is usually more efficient because it makes
it possible to compute, at compile time, the number of
"slots" that need to be allocated, once only, at method
invocation time.

Yes, I wonder: since Ruby already goes through all identifiers and flags whether they are variables or methods (at eval time), would it be very difficult to make the variables defined at the moment they come into scope? I think this would make the code I posted above work nicely, the way I would expect.

This is a very minor thing, and I would be surprised
to see an example of code taking advantage of this
unusual (to me) behavior of defined?().

I agree. Right now, the only major use for defined?() that I can see is to make *sure* you don't get any NameErrors when you are doing something strange.
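
For example, a couple of common guards of that sort (a sketch):

   DEBUG = true unless defined?(DEBUG) # no warning if loaded twice
   @log = [] unless defined?(@log)     # lazy default for an unset ivar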

cheers,
Mark
