David Alan Black wrote:
When calling String#split with no arguments, it defaults to using $;,
…only if $; is non-nil (because you can’t split on nil)
which, by default, is set to nil. However, calling String#split with an
explicit nil argument (including $; when it is nil) raises an exception.
This surprised me, which runs counter to the Principle of Least Surprise.
Oughtn’t String#split use $; when called with a nil first argument?
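For reference, a minimal sketch of the no-argument default under discussion (note that the raising behaviour for an explicit nil is version-dependent; later Rubies changed split(nil) to fall back to $;):

```ruby
# With no argument (and $; left at its default of nil), String#split
# splits on runs of whitespace, ignoring leading whitespace.
p "  one two\tthree ".split   # => ["one", "two", "three"]
```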
Can’t you just not provide an argument? And if you want to use $;,
you can just assign to $; (and not provide an argument).
The issue comes up when you use the second argument (the limit).
It’s hardly an inhibiting problem, but it seems an unnecessary quirk.
Other methods work by giving an argument a default value; that isn’t the
case here. The argument acts like $;, except when $; is nil. Why not
either make nil mean ' ', so it acts like $; all the time, or make nil
mean “the behaviour when the argument is not present”?
The former means we write string.split($;, 2) to get “the behaviour with
no arguments, but limit the array to 2 elements,” whereas the latter means
we write string.split(nil, 2). Currently we need string.split($; || ' ', 2)
(I think), which seems silly.
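To make the workaround concrete, a small sketch (assuming $; is still nil, its default; a pattern of ' ' triggers the same awk-style whitespace split as the no-argument form):

```ruby
str = "one two three"

# Explicit ' ' pattern: awk-style whitespace split, limited to 2 fields.
p str.split(' ', 2)         # => ["one", "two three"]

# The workaround discussed above: fall back to ' ' when $; is nil.
p str.split($; || ' ', 2)   # => ["one", "two three"]
```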
On Thu, 4 Jul 2002, George Ogata wrote: