Robert Klemme wrote:
Or, in other words: if the decision to unify Symbol and String would
have been taken at early stages of Ruby development, then the
general usage would have adapted to this, and ...
we might be happier with the result today.
I am in no way unhappy with the way it is today. Strings and symbols serve different purposes, although there is some overlap. I rarely feel the need to convert between the two.
I see.
And I am quite surprised, because judging from your online activity
you seem to have quite some experience.
Perhaps it is also my programming style: I may use symbols where one
normally would use strings.
I am not aware of a situation where you would need to mix them as hash keys. And making the distinction is pretty easy most of the time, IMHO.
Not aware? I mean Rails mixes them, right?
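For readers following along: the mixing Rails is known for stems from the fact that a plain Hash treats a Symbol and the equally spelled String as two distinct keys. A minimal illustration (plain Ruby, no Rails required):

```ruby
# In a plain Hash, :name and "name" are different keys, so code
# that mixes symbols and strings must remember which kind it used.
h = { :name => "Alice", "name" => "Bob" }

h[:name]   # => "Alice"
h["name"]  # => "Bob"
h.size     # => 2
```

Rails papers over this with ActiveSupport's HashWithIndifferentAccess, which normalizes keys so both spellings reach the same entry.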
Frankly, I believe there is an inherent advantage in being able to use symbols vs. strings in code. And I mean not only performance-wise but also readability-wise.
Readability-wise: precisely what advantage?
The only thing that comes to my mind just now is
that a separate Symbol class easily provides
distinct special values for a parameter that would normally carry a String.
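One way to read that point (a hypothetical sketch, not taken from the thread): a Symbol can serve as an out-of-band "mode" value for a parameter that otherwise carries String data, so the special values can never collide with user-supplied strings:

```ruby
# Hypothetical example: symbols mark special alignment modes,
# visually and semantically distinct from the String payload.
def pad(text, mode)
  case mode
  when :left   then text.ljust(20)
  when :right  then text.rjust(20)
  when :center then text.center(20)
  else raise ArgumentError, "unknown mode: #{mode.inspect}"
  end
end

pad("hi", :right)  # right-justified in a 20-character field
```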
Note, though, that all these issues have nothing to do with the question
of whether String and Symbol should be connected inheritance-wise. IMHO that's mostly an implementation decision in Ruby.
Yes, I agree.
I am actually interested in the implications for the programmer.
My original question simply arose from the notion
that this implementation decision could have been a move
in a (to my mind) favourable direction.
Yes, I sometimes think of that separation of Symbol from String
as a tiny impurity in the Ruby crystal.
Personally, I believe it creates more expressiveness. If you view this as impurity, there are a lot of them in Ruby, because Ruby's focus
has always been on pragmatism and not purity.
1. The core structure must of course be large enough, and a large structure may look impure.
2. But regarding this particular question: My original notion was that keeping
Symbol and String too separate is not pragmatic.
(I may change my mind on that, if I read more posts like yours, though.)
So, I'll just have to come to terms with it.
(And I will, of course -- there are enough other fascinating issues...)
The capability to adjust to reality is a useful one IMHO.
Well, yes, sometimes I'm glad someone tells me that.
create a class hierarchy similar to the Float/Integer hierarchy?
String < Stringlike
Symbol < Stringlike
Why not? StringLike could even be a module that relies solely on #[] and #length to do all the non-mutating stuff.
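The suggestion above could be sketched roughly like this (the module name and helper methods are hypothetical; this is not how Ruby actually arranges String and Symbol):

```ruby
# Hypothetical StringLike mixin: non-mutating helpers built
# solely on #[] and #length, shared by String and Symbol.
module StringLike
  def stringlike_empty?
    length == 0
  end

  def first_char
    self[0]  # Symbol#[] also returns a String in modern Ruby
  end
end

class String; include StringLike; end
class Symbol; include StringLike; end

:foo.first_char       # => "f"
"".stringlike_empty?  # => true
```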
Ah, interesting. Can't follow the implications right now.
Given the fact that I don't mix symbols and strings as Hash keys, I wouldn't benefit -
but it would not hurt me either.
YMMV
Yes, that was the idea behind it: to benefit some and not hurt the others.
Credit also goes to the community, which is still among the most civilized online communities I have encountered so far!
Indeed, I'm experiencing it right now!
Thanks a lot!
Sven
On 15.05.2007 03:07, enduro wrote: