Thank you very much for the excellent answers, and your well-reasoned
responses to what could easily have been dismissed as someone "not
getting it" or attempting to start a flame war.
I have quoted various aspects of the responses and added my response or
further comments.
-- May I then ask what made you consider Ruby in the first place?
The reason I'm considering it is because I don't want to blindly choose
Java just because it's the default.
As for why Ruby and not Erlang, Scala, Groovy, etc. -- the honest answer
is that Ruby is getting so much attention these days, to the point of
religious fervor among many of the people I speak to, that I feel I need
to take an objective look at it and at what it does well.
I greatly dislike religious wars in technology (Linux vs. Mac vs.
Windows, for example) and thus want to understand the objective
benefits and drawbacks as opposed to personal taste.
-- "seeing how well Ruby can pretend to be Java"
My intention is not to see how Ruby can pretend to be Java.
I'm using Java as the point of comparison for a few reasons:
- it's what I have the most expertise in
- it's generally the "default" choice in the types of projects and
development teams I lead
- the language has a very wide range of understanding in the development
world and is therefore a good point of reference to discuss from
- I need a valid point of reference for objective performance
comparisons
That being said, I am trying to figure out what the "Ruby way" is -
which so far is far from clear to me.
I appreciate the reference to the Design Patterns in Ruby book. That is
very much the type of recommendation that will probably help me out, so
thank you.
-- You won't really get a taste for what we mean until you start writing
-- Ruby (real Ruby, not Java ported line-by-line to Ruby).
What example open-source projects can you refer me to that exemplify
the "real Ruby" style of doing things?
I'd prefer non-Rails projects, as I understand that Rails takes a
completely different approach to webapp development.
I'm looking specifically at Ruby.
I keep getting told that I must understand the "Ruby way" - so I'd
appreciate instruction on how to accomplish the "Ruby way", considering
I am apparently boxed in as a "Java/C style programmer" ... despite
disliking C.
-- But if you are in a position to be able to throw more hardware at the
-- problem, it does really become a question of CPU time vs programmer
-- time. That is, if Ruby really does cost 3x the CPU of Java, you can
-- calculate in real dollars how much it will cost to use.
If scalability were the only issue, this would be a valid response.
For example, if Java and Ruby both performed single-threaded
transactions at 150ms each, and both scaled to 10 concurrent threads
equally well, but Java continued to scale to 30 concurrent threads and
Ruby did not, then that's a scenario where I could add 3 machines to
scale Ruby horizontally and truly argue that the cost of the hardware is
more than made up for by lower developer costs.
But, "per request" performance does not get improved by this type of
solution.
Adding faster hardware does not make Ruby catch up to Java - since Java
also improves with faster hardware.
This is why the "add hardware" answer doesn't win me over on the
performance issue, because performance and scalability are two
completely different problems. I haven't even begun to test scalability
with Ruby yet.
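To put rough numbers on that distinction (these figures are the
hypothetical ones from my scenario above, not benchmarks of anything):

```ruby
# Hypothetical figures from the scenario above, not real benchmarks.
latency_s       = 0.150  # 150 ms per request, in either language
threads_per_box = 10     # concurrency one machine handles comfortably
boxes           = 3      # extra hardware bought to scale horizontally

throughput = boxes * threads_per_box / latency_s
puts throughput  # 200.0 requests/sec across the cluster: hardware helps
puts latency_s   # still 0.15 s per request: hardware doesn't help
```

Adding boxes multiplies the first number but never touches the second -
which is exactly the performance-versus-scalability split I mean.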
-- Response time is only part of the story. What you really want to
-- benchmark is requests per second, and that's not always as simple as
-- multiplying response time.
That's correct ... but it supports my point. Requests per second is
throughput, or scalability - not performance.
That is something I can throw hardware at - performance is not.
-- Which is pretty much going to give you benchmark candy. If your site
-- is slowing down, that's a bug. Once the speed of the site is
-- acceptable, and you're set up to handle spikes appropriately, more
-- performance doesn't really buy you anything other than "because you
-- can".
I disagree. If I can cut 30% of the transaction time off of a search
engine request, that is valuable.
It provides a better user experience and (according to both Google and
Amazon) increases users' usage of the system.
Performance of response (not talking about scalability here, but actual
performance) is more than just "bragging rights" or "benchmark candy".
The speed at which an application responds to an end user's request
impacts the overall usability of the application.
It is for this same reason that things such as network compression,
network optimization (CDNs, Akamai route acceleration etc) and client
side caching also all play a role.
In the presentation layer, however, I tend to think the performance
degradation of using Ruby is far less of an issue than in backend
services, since I/O plays such a huge role there - which is more or less
what ThoughtWorks has concluded from its reported experiences with Ruby.
-- premature optimization is the root of all evil
I 100% agree. That's Donald Knuth (or Tony Hoare), if I recall
correctly.
-- My preference would be, if I can write 97% of the program in Ruby,
-- and 3% in C, is that really going to be less pleasant than writing
-- 100% of the program in Java?
An interesting observation and one I must consider.
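For what it's worth, Ruby's standard library ships Fiddle for exactly
this kind of split. Here is a minimal sketch that calls libc's strlen
from Ruby; it assumes a platform where strlen is visible in the
process's default symbol table (typical Linux or macOS builds):

```ruby
require 'fiddle'

# Bind libc's strlen through Fiddle, Ruby's stdlib FFI layer.
# Assumes strlen is resolvable via the default handle, which holds
# on typical Linux/macOS Ruby builds.
strlen = Fiddle::Function.new(
  Fiddle::Handle::DEFAULT['strlen'],
  [Fiddle::TYPE_VOIDP],  # const char *
  Fiddle::TYPE_SIZE_T    # size_t
)

# The length is computed in C; only the call site is Ruby.
puts strlen.call('C does the hot 3%')  # 17
```

For a real hot path you would write your own C extension rather than
binding libc, but the 97%/3% division of labor is the same idea.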
-- when was the last time the type system saved you?
This is a valid and interesting question.
I would suggest that it's not that it is "saving" anything, because
there is nothing to save once the application is running: the code
can't even be compiled if things aren't type-safe.
It's the toolset, as you suspected.
It's the readability of code - knowing exactly what types a given
argument, variable, or array contains.
It's the IDE telling me as I type when errors are occurring, showing me
what objects relate to what, letting me navigate through code, etc.
I've tried RubyMine, Aptana, and NetBeans. They attempt this dynamic
interpretation but are far from accomplishing it.
For example, code completion in these tools, which should suggest the
available API methods, is almost useless: because they cannot infer the
actual type of a variable, they offer virtually every method under the
sun. They'll show me 15 different versions of a method with the same
name, all for different object types from the Ruby API.
Similarly, looking at an array or collection in Ruby does not tell me
what it is, especially if things are being passed around through
methods, across class boundaries etc. Instead of the method signature
telling me "Collection<ZebraAnimal>" I just see a variable.
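To make that concrete, here is a minimal sketch; ZebraAnimal, #feed,
and feed_all are hypothetical names invented for illustration, not from
any real API:

```ruby
# Hypothetical class, used only to illustrate the point.
class ZebraAnimal
  def feed
    'fed'
  end
end

# Java would declare: List<ZebraAnimal> feedAll(List<ZebraAnimal> zebras)
# In Ruby the signature says nothing about what 'zebras' holds. Any
# collection whose elements respond to #feed works (duck typing);
# anything else fails only at runtime, when the call is actually made.
def feed_all(zebras)
  zebras.map(&:feed)
end

p feed_all([ZebraAnimal.new, ZebraAnimal.new])  # ["fed", "fed"]
```

The flexibility is real, but so is the point above: nothing in
`def feed_all(zebras)` tells the reader or the IDE what `zebras` is.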
Thus, I must now depend on a team of developers properly documenting
everything, using very descriptive naming conventions (and properly
refactoring all of that when changes occur), and wrapping everything in
unit tests.
Now, all of those are "ideal" practices - ones I believe in and stress
continually. I have hundreds and hundreds of unit tests, automated
build servers, etc. - but in the "real world", getting teams to comment
every method, properly name (and refactor) variables, and cover
everything in unit tests just doesn't happen - unless it's a small team
of very competent people who all believe in the same paradigm and treat
their code as art. I wish that's how all dev teams were, but it's not
the reality.
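When that discipline does hold, the test that stands in for the
compiler's type check looks something like this (plain raise-based
assertions; ZebraAnimal and feed_all are hypothetical names invented
for illustration):

```ruby
# Hypothetical code under test.
class ZebraAnimal
  def feed
    'fed'
  end
end

def feed_all(zebras)
  zebras.map(&:feed)
end

# Happy path: the duck-typed contract is honored.
raise 'feed_all broken' unless feed_all([ZebraAnimal.new]) == ['fed']

# Type error: it surfaces only at runtime, so the suite must provoke it.
begin
  feed_all(['not an animal'])
  raise 'expected a NoMethodError for a mistyped element'
rescue NoMethodError
  # String has no #feed - this is the failure a Java compiler
  # would have caught statically, for free.
end
puts 'tests passed'
```

In other words, the test suite is not optional polish in Ruby; it is
carrying part of the load the compiler carries in Java.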
Perhaps if it's a personal project where I know the code and can ensure
all is covered it's a different story.
-- 100 lines of code is generally easier to read and
-- debug than a thousand.
I'll give you that - but I have yet to see anything that proves to me
that a competent developer would write 10x as much code in Java (or C#,
for that matter) as they would in Ruby.
The "cruft" so often referred to consists of things I don't even
consider or think of. Boilerplate code ... clutter and sometimes
annoying ... fades into the background, and tools remove the pain of
it. And with the advent of annotations, many of these arguments
disappear when Java code is written correctly with modern patterns and
frameworks.
-- I think that particular code sample was misleading -- certainly, you
-- can play Perl Golf in any language. But you have coding conventions
-- in Java, and you would in Ruby.
You surely can, and I'm trying to understand what the coding
conventions are in Ruby. The book link offered is something I'm going
to go look at. Amazon recommended another book called "The Ruby Way",
which may also give me good insights. Any experience with that one?
-- ... best served by Ruby ...
-- small scripts for system administration
I completely agree here.
-- any sort of web app ... that needs to be flexible and constantly
-- maintained and improved, for which the developer controls the
-- hardware.
I'm leaning more and more towards this. In fact, I'm trying to figure
out how to rip Java out of my webapps completely, leave that to the
backend webservices, and keep the presentation layer as free from
"code" as possible. Java developers typically aren't the best at
client-facing solutions (don't attack me on this if you disagree ...
it's of course not a definitive rule; it's just that I find it more
challenging to hire good web developers who are 'Java' skilled as
opposed to PHP, Ruby, Javascript, CSS, etc.).
For example, if I can accomplish a dynamic front-end purely driven by
client-side Javascript using AJAX techniques against a REST-style
webservices backend, I will try to pursue that.
The middle ground seems to be pursuing Ruby, or something else that is
still server-side but better suited to the ever-changing pace of webapp
development, with a more creative, script-driven coding style better
suited to web developers and designers.
-- desktop app that doesn't need the highest performance possible
What you say makes sense here, but I am so far removed from desktop apps
that I'm useless in weighing in on this.
-- It's about whether it's possible to throw CPUs at the problem, or
-- whether the CPU is the bottleneck at all.
Understood and I agree.
···
--
Posted via http://www.ruby-forum.com/.