Charles Nutter wrote in post #975363:
>>> > things
>>> > I actually feel are broken about Ruby as it is today.
>>>
>>> I'm sorry to take this slightly off topic, but I just can't let this
>>> comment pass. I am currently building a language called Quby, which is
>>> a very Ruby-like language, that runs 100% in the browser (about 4,000
>>> lines of JS). There are differences (as it's Ruby-like not Ruby) with
>>> the main one being that it's tied to a canvas. But AFAIK it's the
>>> closest pure JS implementation of Ruby (although it's Quby, not Ruby).
>
> I have to ask...does an applet not give you what you want? JRuby does
> run fine in an applet, and with recent improvements to the Java
> browser plugin it can do anything a Flash or other language plugin can
> do (traverse and manipulate DOM, etc).
>
> - Charlie
[snip]
I've built lots of
applets in the past, and although they can work, they are far from being
as reliable as Flash or HTML 5. Java penetration is still pretty bad:
only about 60% of users have Java 6, which is now over 5 years old.
Aside from this, the browser already has a perfectly good language and
runtime (JavaScript), so adding another one can't be good for
performance.
I will definitely consider applets when I can actually dictate the client to
some extent -- for instance, right now I'm taking a physics class which
requires Java, Flash, _and_ Silverlight (moonlight doesn't _quite_ work). If I
were writing software with that much freedom to declare "You WILL have browser
plugin X", then sure.
For the moment, however, my strategy has been to keep the client as simple as
I reasonably can, and try to actually learn JavaScript as well as I know Ruby.
JavaScript isn't a bad language, it's just ugly.
>> First, lots of small things I don't like, such as nil, hash comments and
>> Ruby's block comments.
>
> This is _exactly_ what I'm talking about. I already know how to write
> Ruby
> with things like nil and hash comments. I don't see why I should have to
> learn
> an entirely new syntax, and teach my editor an entirely new syntax, for
> such
> trivialities as replacing nil with null.
>
> This just shouldn't factor into it.
It's not entirely new syntax, it's mostly the same (or very similar).
Maybe "entirely new" was the wrong word, but consider: I _like_ Ruby syntax.
Occasionally, I'm forced to write C, even C++. But I'm still not going to do
something like this:
#define self (*this)
But that's a fair point, as on the surface it does look like "change for
the sake of change". I'd argue far more people have heard of null rather
than nil, I bet more languages use C-style single-line comments over
hash comments, and Ruby block comments are really ugly and almost
unusable.
Lots of Rubyists, Perlists, Pythonistas, and shell ninjas disagree.
So I have changed items to common, well-known syntax rather
than just random stuff.
But see, the problem is that rather than closely tracking an existing
language, you're building a hybrid of both. In other words, your new language
is really only going to be comfortable to those fluent in at least Java and
Ruby, if not more languages. I like to consider myself fluent in both Java and
Ruby, and it'd _still_ be problematic for me, as I'd constantly be wondering
whether I should follow the Ruby or the Java convention.
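To make that concrete: nil and null aren't just different spellings. In Ruby, nil is a real object (the sole instance of NilClass) with its own methods, so code written against Java's null conventions translates awkwardly. A minimal sketch:

```ruby
# In Ruby, nil is an ordinary object, not a bare sentinel like Java's null.
puts nil.class                   # NilClass
puts nil.to_a.inspect            # [] -- nil even responds to methods
puts nil ? "truthy" : "falsy"    # falsy -- only nil and false are falsy in Ruby
```

So a compiler that simply renames nil to null either keeps Ruby's semantics under a Java name, or silently changes them; either way, one group's expectations break.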
I guess my real question is, why'd you go with a new language rather than
trying to implement an existing one? Why Quby instead of Ruby-on-JavaScript?
If the reason was to add this compile-time verification and make these
language tweaks, I guess I understand, and don't really have much more to say
about it other than that I strongly disagree.
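For reference, here are the two Ruby comment forms being debated (a minimal sketch; note that =begin and =end must each start in column one, which is a big part of why block comments are considered clumsy):

```ruby
# Single-line "hash" comment -- the form Quby swaps for C-style comments.
answer = 6 * 7  # inline comments work too

=begin
Block comment: =begin and =end must each sit at the very
start of a line, so they can't be indented or nested.
=end
puts answer  # => 42
```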
···
On Monday, January 17, 2011 05:20:10 am Joseph Lenton wrote:
> On Sat, Jan 15, 2011 at 4:45 PM, David Masover <ninja@slaphack.com> wrote: