I think one of the hardest things to explain/comprehend
about symbols is that the *only* semantics associated
with a symbol is that there is a one-to-one mapping from a symbol
object to a sequence of characters. It is this isomorphism that
is the useful (and *only*) property of symbols.
The sequence of characters can be specified by
a literal symbol identifier (e.g. :red, :white, :"navy blue"), in which
case the mapping to a symbol reference is done by the parser, or the
sequence of characters can be specified by an instance of a String
("red") and mapped to a symbol reference via the String#to_sym method
("red".to_sym).
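A quick sketch of the two routes, showing that they arrive at the same symbol:

```ruby
# A literal symbol identifier and String#to_sym both name the same
# character sequence, so they yield the same symbol:
puts :red == "red".to_sym        # true
puts :"navy blue".to_s           # navy blue
```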
This is entirely analogous to how integer literals are handled.
A sequence of characters (ascii 51 and ascii 53, i.e. the characters
'3' and '5') is mapped by the
parser to a reference to an object that has the properties of the number 35.
You can also take an instance of String ("35") and have String#to_i
map it to a reference of an object that has the properties of the number 35
("35".to_i). That doesn't mean that the string "35" *is* the number 35, just that
there is an isomorphism from a sequence of digits to references to integers.
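A sketch of that distinction in code:

```ruby
# String#to_i maps the character sequence to an integer reference;
# the String itself never becomes a number:
puts "35".to_i == 35     # true  -- the mapped object is the number 35
puts "35" == 35          # false -- the string is not the number
puts "35".to_i + 1       # 36    -- the mapped object has numeric properties
```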
I've been careful to say 'reference to an object' and not 'object' because
the values stored in variables or constants are references to objects, not
the objects themselves. For some classes, such as Fixnum or Symbol, the
objects are implicit and Ruby derives them from the reference itself when
needed; for other objects, say String or Array, the reference is used to
locate the object in memory.
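You can see this difference with object_id (a sketch; this reflects MRI's
behavior, and the particular id values vary by implementation):

```ruby
# Implicit (immediate) values: every mention is the very same reference
puts 35.object_id == 35.object_id          # true
puts :red.object_id == :red.object_id      # true
# Heap objects: each String literal creates a fresh object in memory
puts "red".object_id == "red".object_id    # false
```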
With integers, the referenced object associated with the sequence of digits
has nice numerical properties but symbols have no properties
other than their property of being uniquely identifiable.
So when your intent is to reference objects with distinct identities but no
other properties, Symbols are the way to go. You could use integers,
of course; they too are unique and easily represented by
a sequence of characters, but which is more meaningful?
  flavors = [1, 2, 3]
  flavors = [:chocolate, :vanilla, :strawberry]
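And because symbols carry nothing but identity, they make natural hash keys.
A hypothetical price table (the names and numbers here are mine, purely
illustrative):

```ruby
# Symbols as pure identities keying a lookup table
# (hypothetical data for illustration):
PRICES = { :chocolate => 2, :vanilla => 1, :strawberry => 2 }
puts PRICES[:vanilla]      # 1
```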
I think symbols get confused with methods, and classes, and instance variables
because the concept of identity is just as useful to an application programmer
as to a language implementor (that and the PickAxe incorrectly says they are
directly related). Ruby just happens to expose the identity of various
language internals using the same mechanism (symbols) as is available to the
application programmer. No one gets confused when an Array reports its size
using the integer 5 even though the programmer may have also used the number
5 to limit the size of a zipcode field. Similarly, the idea that Ruby might
use the symbol identified by :size to uniquely identify an instance method
in the class Array shouldn't be confused with a programmer using the same
symbol to identify a method in a class they have defined. The symbol is the
same in both cases but the context is different so there is no ambiguity.
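A sketch of that last point, using a hypothetical Zipcode class of my own:

```ruby
# The same symbol :size names an instance method both in Array and in a
# user-defined class; the receiver supplies the context, so there is
# no ambiguity about which method runs.
class Zipcode
  def size
    5   # hypothetical: zipcodes are five digits
  end
end

puts [10, 20, 30].send(:size)   # 3 -- Array#size
puts Zipcode.new.send(:size)    # 5 -- Zipcode#size
```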
...
On Dec 29, 2005, at 9:38 PM, Chad Perrin wrote:
> I disagree. If the underlying semantic mechanism of a symbol isn't
> understood, symbols won't be understood.