Hi Rubyists,
I have a question that is not directly about Ruby but about other
languages (seen from Ruby's point of view).
I may have to post it to other newsgroups as well, so if you think
this posting is inappropriate, please forgive me.
I understand how Ruby implements value types (Fixnum, TrueClass,
FalseClass, NilClass) and reference types (all other classes).
To tell them apart, the low-order bit or two of each value word are
used as tag flags (which is why a Fixnum is 31-bit instead of 32-bit).
And I think that's a really ingenious design.
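For example, here is how I understand the tagging shows up through
object_id (just a minimal sketch, assuming MRI/CRuby; the exact
constants for true and nil differ between versions, these are the
1.8/1.9 values):

  # Immediate values encode their value directly in the VALUE word,
  # so their object_ids are small, fixed numbers (MRI only).
  p 1.object_id       # => 3    (a Fixnum n is stored as 2n + 1)
  p 100.object_id     # => 201
  p false.object_id   # => 0
  p true.object_id    # => 2    (20 on Ruby >= 2.0)
  p nil.object_id     # => 4    (8 on Ruby >= 2.0)
  p "abc".object_id   # => a large number derived from the heap object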
I wanted to compare that design to JavaScript.
In JavaScript, numbers (8 bytes), booleans, null, and undefined are
immediate values, and everything else is a reference.
I want to know how these are handled in memory (value size, bit
flags, etc.), and especially how the implementation differs from
Ruby's.
I couldn't find a good document about it.
Yesterday, I posted the question to comp.lang.javascript but nobody
seems to care so far.
So I'm asking for help from Rubyists who also understand JavaScript
well.
If possible, I would also like to hear about Python.
I read some postings about this issue on comp.lang.python, and it
seems that everything in Python is a reference type, unlike Ruby.
Is that true?
I'm nervous that somebody might throw stones at me for this off-topic
posting.
I hesitated a lot before posting, but I thought this was the best way
to get an answer.
Thanks.
Sam