The economics of a slow but productive Ruby

[NOTE: I'm trying to present the facts and be objective in this post.
I love Ruby, and would choose it any day if economics didn't matter.
But in "Real World" terms, this is what I discovered. And of course,
if I made any serious mistakes, be sure to let me know!]

Company QUUX is deciding on technologies for a new project. They
estimate a development budget of A and a hardware budget of B under
technology BAR:

  development budget under BAR = A
  hardware budget under BAR = B
  total budget under BAR = A + B

They are also considering technology FOO. FOO is widely reputed to
grant productivity gains of a factor Y, but is slower than BAR,
requiring X times the servers. FOO developers make about Z times as
much as BAR developers, on average:

  X = servers required under FOO / servers required under BAR
  Y = productivity FOO / productivity BAR
  Z = annual FOO salary / annual BAR salary

The development budget under FOO would be reduced by the productivity
gain, but that gain is partly offset by the difference in salary:

  development budget under FOO = AZ/Y

The hardware budget under FOO would be increased by the factor X:

  hardware budget under FOO = BX

The total budget under FOO, in terms of the budget under BAR, would then be:

  total budget under FOO = AZ/Y + BX

Given these estimates, it would be a profitable decision to choose FOO
over BAR if and only if the total budget under FOO is less than the
total budget under BAR.

  choose FOO iff AZ/Y + BX < A + B -- or, rearranging...
  choose FOO iff (X - 1)B < (1 - Z/Y)A
  choose FOO iff [(X - 1) + (1 - Z/Y)]B < (1 - Z/Y)(A + B)
  choose FOO iff B < [(1 - Z/Y) / (X - Z/Y)](A + B)
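For what it's worth, the whole decision rule fits in a few lines of
Ruby (the method names here are mine, purely for illustration):

```ruby
# Break-even fraction f such that FOO wins iff B < f * (A + B),
# where f = (1 - Z/Y) / (X - Z/Y), per the rearrangement above.
def hardware_fraction(x, y, z = 1.0)
  (1.0 - z / y) / (x - z / y)
end

# The original form of the rule: choose FOO iff A*Z/Y + B*X < A + B.
def choose_foo?(a, b, x, y, z = 1.0)
  a * z / y + b * x < a + b
end
```

With X = 5, Y = 5 and Z = 1, `hardware_fraction(5, 5)` works out to
1/6, matching the arithmetic below.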

Let's apply this estimate to the current standing between .NET and
Ruby/Rails, using the figures from Joel (X = 5, Y = 5). In this case,
Z = 1 (actually, in my comparisons, Z was slightly *less* than one).

  (1 - Z/Y) / (X - Z/Y)
  = (1 - 1/5) / (5 - 1/5)
  = (4/5) / (24/5)
  = 4 / 24
  = 1 / 6

So, choosing Ruby over .NET (assuming Joel's numbers are correct) is
economically sound iff your hardware budget makes up 1/6th or less of
the total estimated .NET budget.

Now, let's assume 20 servers and a 5-year application lifespan, with a
$5K one-time cost per server, $500 per server annually for repairs,
and one sysadmin with a salary comparable to the developers ($60K).
This brings our hardware budget to $450K over the 5 years[1]. If this
is to be only 1/6 of the total budget, we need to be spending at least
5 times as much on developers -- $2.25M over the lifespan, or $450K
per year. Using the same $60K figure for developer salaries, this
comes to 7.5 developers. So, if your developer-to-server ratio is at
least 3 developers for every 8 production servers, Ruby is probably
economical. If you start getting a lot more servers than developers,
however, the hardware cost of a slow Ruby builds up on you.
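As a sketch, the arithmetic behind those figures (all dollar amounts
are the assumptions stated above, not data):

```ruby
servers      = 20
years        = 5
server_cost  = 5_000    # one-time, per server
repairs      = 500      # annual, per server
admin_salary = 60_000   # annual, one sysadmin
dev_salary   = 60_000   # annual, per developer

hardware_budget = servers * server_cost +
                  servers * repairs * years +
                  admin_salary * years        # 450_000 over 5 years

# If hardware may make up at most 1/6 of the total budget, development
# spending must be at least 5x the hardware budget:
min_dev_budget = hardware_budget * 5                         # 2_250_000
developers     = min_dev_budget / (dev_salary.to_f * years)  # 7.5
```

It also makes the footnote's point visible: $300K of the $450K (about
67%) is the sysadmin's salary rather than the hardware itself.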

Jacob Fugal

[1] It's interesting to note however that 67% of that figure is still
in paid salaries, rather than the cost of the hardware itself. If
you've got a super sysadmin who can manage 100 boxes (and you better
be paying them at least 80K if they are that super), the hardware
budget will scale a lot better. There's a lot to be said for getting
your hands on a good sysadmin...

1) It doesn't take 5 times more boxes for a ruby app than a .NET app; the single biggest factor in efficiency is the quality of the developer. You can do many things on the code level to optimize server CPU. I've never found it to be an issue. Honestly, if you need 5 times more servers to run a Ruby on Rails app than a .NET app, I'll have to laugh.

As an example, I worked for a company that developed a PHP app and it took 15 application servers to run it when it should have taken 5. It took that many because the coding (before I was hired) was terrible. The same can happen with any technology.

2) Network latency is a far bigger bottleneck than CPU. All technologies face the same problem.

3) Joel pulled that number out of his ass. I mean, I could say that the same app coded in .NET would take 2834 servers whereas it would run on a 3-year-old Palm using Ruby. That doesn't make it true.

4) I didn't see any factor for software budget.

-carl

···

On Sep 11, 2006, at 5:56 PM, Jacob Fugal wrote:

  choose FOO iff B < [(1 - Z/Y) / (X - Z/Y)](A + B)

So, choosing Ruby over .NET (assuming Joel's numbers are correct) is
economically sound iff your hardware budget makes up 1/6th or less of
the total estimated .NET budget.

Also note that the values I used here are pretty conservative. As many
have mentioned, Ruby will often not be the bottleneck -- X can be less
than 5. Also, depending on your programmers, Y may be more or less
than 5. Doing the calculation with X = 2 and Y = 10 yields much more
favorable results:

  (1 - Z/Y) / (X - Z/Y)
  = (1 - 1/10) / (2 - 1/10)
  = (9/10) / (19/10)
  = 9 / 19
  ≈ 47%

So under optimistic cases, Ruby will still be economical until
hardware eats up *half* your budget. Or, pessimistically, let's try X
= 10, Y = 2:

  (1 - Z/Y) / (X - Z/Y)
  = (1 - 1/2) / (10 - 1/2)
  = (1/2) / (19/2)
  = 1/19

Your hardware budget would need to be very small (about 5% of the
total) under those circumstances to make Ruby economical.
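A quick sweep over all three cases (Z = 1 throughout; the scenario
labels are just these examples, not measurements):

```ruby
z = 1.0
scenarios = {
  "Joel's figures" => [5, 5],
  "optimistic"     => [2, 10],
  "pessimistic"    => [10, 2]
}

# Break-even hardware fraction for each (X, Y) pair.
fractions = scenarios.transform_values do |(x, y)|
  (1.0 - z / y) / (x - z / y)
end

fractions.each do |label, f|
  puts format("%-14s break-even hardware fraction = %.3f", label, f)
end
```

which prints 0.167, 0.474, and 0.053 respectively.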

Fortunately, in my experience, X has never even approached 5, let
alone 10. And Y has always been good to me. The important thing is
that for *your* decision, you need to:

1) Evaluate what X is *for your application*
2) Evaluate what Y you will believe
3) Know how your hardware costs will scale (see the footnote in my
original email)

All these factors will affect the outcome greatly.

Jacob Fugal

···

On 9/11/06, Jacob Fugal <lukfugl@gmail.com> wrote:

  choose FOO iff B < [(1 - Z/Y) / (X - Z/Y)](A + B)

Let's apply this estimate to the current standing between .NET and
Ruby/Rails, using the figures from Joel (X = 5, Y = 5). In this case,
Z = 1 (actually, in my comparisons, Z was slightly *less* than one).

Howdy folks. As a top-notch sysadmin, I just wanted to remind y'all that I'm out here.

-- Matt
It's not what I know that counts. It's what I can remember in time to use.

···

On Tue, 12 Sep 2006, Jacob Fugal wrote:

[1] It's interesting to note however that 67% of that figure is still
in paid salaries, rather than the cost of the hardware itself. If
you've got a super sysadmin who can manage 100 boxes (and you better
be paying them at least 80K if they are that super), the hardware
budget will scale a lot better. There's a lot to be said for getting
your hands on a good sysadmin...

It also helps if you're using a system that has a lower
admins-to-servers requirement ratio. As indicated by recent studies,
Linux and Solaris both require far fewer admins for the number of boxen
than Windows:

  http://www.cioupdate.com/article.php/10493_1477911

From the article:

  Linux, along with Solaris, also came out ahead of Windows in terms of
  administration costs, despite the fact that it's less expensive to
  hire Windows system administrators. The average Windows administrator
  in the study earned $68,500 a year, while Linux sys admins took home
  $71,400, and those with Solaris skills were paid $85,844. The Windows
  technicians, however, only managed an average of 10 machines each,
  while Linux or Solaris admins can generally handle several times that.

This, like the number of servers required for a given software project,
does not scale linearly -- but the scalability of Windows systems in
terms of administrative requirements never overtakes that of Solaris and
Linux systems (except possibly in pathological edge-cases).

···

On Tue, Sep 12, 2006 at 09:56:58AM +0900, Jacob Fugal wrote:

[1] It's interesting to note however that 67% of that figure is still
in paid salaries, rather than the cost of the hardware itself. If
you've got a super sysadmin who can manage 100 boxes (and you better
be paying them at least 80K if they are that super), the hardware
budget will scale a lot better. There's a lot to be said for getting
your hands on a good sysadmin...

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
Ben Franklin: "As we enjoy great Advantages from the Inventions of
others we should be glad of an Opportunity to serve others by any
Invention of ours, and this we should do freely and generously."

Jacob Fugal wrote:

[1] It's interesting to note however that 67% of that figure is still
in paid salaries, rather than the cost of the hardware itself. If
you've got a super sysadmin who can manage 100 boxes (and you better
be paying them at least 80K if they are that super), the hardware
budget will scale a lot better. There's a lot to be said for getting
your hands on a good sysadmin...

Ah, but does SuperSysAdmin have to use a slow scripting language?

<ducking>

It's rarely a matter of economics and statistics. Essentially there is
very little good scientific data on the relative merits of different
development systems. Practically all the research on productivity and
sociology in development was done in the 70s and early 80s.

Beyond that it is really a matter of faith.

Jacob Fugal wrote:

···

[NOTE: I'm trying to present the facts and be objective in this post.
I love Ruby, and would choose it any day when economics didn't matter.

I agree, but I was using the numbers from Joel's article. See my
follow up email for a little more detail on what I believe it would
*really* be...

My *main* point in the original email is that there *is* a line where
throwing more servers at it isn't economical. Where that line is
depends a great deal on your individual situation.

Jacob Fugal

···

On 9/11/06, Carl Lerche <carl.lerche@verizon.net> wrote:

1) It doesn't take 5 times more boxes for a ruby app than a .NET app,
the single biggest factor in efficiency is the quality of the
developer. You can do many things on the code level to optimize
server CPU. I've never found it to be an issue. Honestly, if you need
5 times more servers to run a ruby on rails app than a .NET app, I'll
have to laugh.

Do you suggest they should use a slower scripting language, like batch
files? It's not like sysadmins write their administrative scripts in
assembly language for performance.

···

On Tue, Sep 12, 2006 at 12:30:05PM +0900, M. Edward (Ed) Borasky wrote:

Jacob Fugal wrote:

>
> [1] It's interesting to note however that 67% of that figure is still
> in paid salaries, rather than the cost of the hardware itself. If
> you've got a super sysadmin who can manage 100 boxes (and you better
> be paying them at least 80K if they are that super), the hardware
> budget will scale a lot better. There's a lot to be said for getting
> your hands on a good sysadmin...

Ah, but does SuperSysAdmin have to use a slow scripting language?

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
"A script is what you give the actors. A program
is what you give the audience." - Larry Wall

Which is significant. I don't know the cost of licenses for Windows
servers, but I imagine it is costly, not to mention things like
development tools... Actually, I imagine a .Net project could greatly
exceed its hardware costs in software costs, given the right
circumstances.

···

On 9/11/06, Carl Lerche <carl.lerche@verizon.net> wrote:

4) I didn't see any factor for software budget.

As others have mentioned, the scripts used by sysadmins aren't really
CPU intensive nor too dependent on the host language speed. So it's a
moot point. :)

Jacob Fugal

···

On 9/11/06, M. Edward (Ed) Borasky <znmeb@cesmail.net> wrote:

Jacob Fugal wrote:
> [1] It's interesting to note however that 67% of that figure is still
> in paid salaries, rather than the cost of the hardware itself. If
> you've got a super sysadmin who can manage 100 boxes (and you better
> be paying them at least 80K if they are that super), the hardware
> budget will scale a lot better. There's a lot to be said for getting
> your hands on a good sysadmin...

Ah, but does SuperSysAdmin have to use a slow scripting language?

I'll actually agree with you here as well. My email actually started
out in a much different direction, which is when I wrote that
disclaimer. The disclaimer escaped editing, however, when the body of
the email changed. When I refer to "objective facts", I am referring
to the derivation of the equation in general, not specific values.

When I started the email I never intended to delve into placing
specific values on X and Y. My primary intent was just to examine the
equation that could result in the abstract. This was to demonstrate
that a boolean choice can be made, but *only* if the party making the
choice is willing to decide on values for X and Y (and Z).

Values for X can be locked down for any given choice. The important
thing is that the X you use be related to your situation. There is no
one-size-fits-all X -- in some cases X will be significant (number
crunching); in others it might be nearly 1. That determination needs
to be made on a per-project basis.

Values for Y on the other hand are very subjective. As you said, there
is very little if any hard scientific data (in the form of published
studies) to support any value of Y. Most of what we have is anecdotal.
*But*, if a decision maker has experienced some of those anecdotes
him/herself, or is willing to accept the judgment of another in
evaluating those anecdotes, they can determine a Y value that they are
willing to believe for the sake of the decision.

My purpose was never to propose that certain values of X or Y are
correct, but rather to provide a framework equation inside which
different values of X and Y can be examined.

Jacob Fugal

···

On 9/12/06, Neil Wilson <aldursys@gmail.com> wrote:

It's rarely a matter of economics and statistics. Essentially there is
very little good scientific data on the relative merits of different
development systems. Practically all the research on productivity and
sociology in development was done in the 70s and early 80s.

Neil Wilson wrote:

It's rarely a matter of economics and statistics. Essentially there is
very little good scientific data on the relative merits of different
development systems. Practically all the research on productivity and
sociology in development was done in the 70s and early 80s.

Beyond that it is really a matter of faith.

Or in this case, mathturbation.

Seriously though, I can't wait until I can put a few numbers into a
computer and have it fart out all my decisions and actions for me.

I realize that you are using the numbers from Joel's article, but (and maybe it's just me) those numbers are so absurd that they don't merit any more discussion than "that's absurd", plus maybe pointing out why, using real-world situations.

Also, yes, there are some extreme cases, such as the Google search engine. However, scaling is not linear. Hypothetically, suppose that at a certain point a .NET web application takes 5 servers and a similar Ruby web application takes 25 (this already sounds a bit ridiculous, but allow me to continue). This does NOT mean that when the .NET application requires 50 servers, the similar Ruby web app will require 250.

As such, I don't see where this line would be, even granting your method of proving that there is one.

And lastly, if there are any developers developing Ruby apps for a company that require 5 times as many servers as an equivalent .NET app... they should be fired :P

-carl

···

On Sep 11, 2006, at 6:21 PM, Jacob Fugal wrote:

On 9/11/06, Carl Lerche <carl.lerche@verizon.net> wrote:

1) It doesn't take 5 times more boxes for a ruby app than a .NET app,
the single biggest factor in efficiency is the quality of the
developer. You can do many things on the code level to optimize
server CPU. I've never found it to be an issue. Honestly, if you need
5 times more servers to run a ruby on rails app than a .NET app, I'll
have to laugh.

I agree, but I was using the numbers from Joel's article. See my
follow up email for a little more detail on what I believe it would
*really* be...

My *main* point in the original email is that there *is* a line where
throwing more servers at it isn't economical. Where that line is
depends a great deal on your individual situation.

Jacob Fugal

What can be said about Ruby here can also be said about Python, even
if Python would be a tiny bit faster. However, I feel that Python --
with all its quirks -- is "more advanced" in terms of being used or
accepted in companies. The world isn't a monoculture; different
qualities (of technologies) can coexist for a long time.

The cost bottleneck still seems to be the sysadmin, no matter which
language, right? =)

···

--
Posted via http://www.ruby-forum.com/.

I don't have current figures, but you're right -- server licenses are
far beyond hardware costs with Windows, especially when including MS
software for development, for the framework, et cetera.

···

On Tue, Sep 12, 2006 at 12:38:29PM +0900, Gregory Brown wrote:

On 9/11/06, Carl Lerche <carl.lerche@verizon.net> wrote:

>4) I didn't see any factor for software budget.

Which is significant. I don't know the cost of licenses for Windows
servers, but I imagine it is costly, not to mention things like
development tools... Actually, I imagine a .Net project could greatly
exceed its hardware costs in software costs, given the right
circumstances.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
Amazon.com interview candidate: "When C++ is your
hammer, everything starts to look like your thumb."

Yeah . . . premature eja^H^H^Hoptimization, evil, et cetera.

···

On Tue, Sep 12, 2006 at 01:50:10PM +0900, Jacob Fugal wrote:

On 9/11/06, M. Edward (Ed) Borasky <znmeb@cesmail.net> wrote:
>Jacob Fugal wrote:
>> [1] It's interesting to note however that 67% of that figure is still
>> in paid salaries, rather than the cost of the hardware itself. If
>> you've got a super sysadmin who can manage 100 boxes (and you better
>> be paying them at least 80K if they are that super), the hardware
>> budget will scale a lot better. There's a lot to be said for getting
>> your hands on a good sysadmin...
>
>Ah, but does SuperSysAdmin have to use a slow scripting language?

As others have mentioned, the scripts used by sysadmins aren't really
CPU intensive nor too dependent on the host language speed. So it's a
moot point. :)

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
"There comes a time in the history of any project when it becomes necessary
to shoot the engineers and begin production." - MacUser, November 1990

i wouldn't agree with that. we just started a trio of ruby scripts running
across three machines which coordinate processing of 30 satellite-years of
data. they are going to peg the machines near 50% for 3-5 months. it may not
be common, but where huge data sets are involved, something as trivial as
'unpacking and moving some data' may take a considerable amount of logic and
cpu.

2 cts.

-a

···

On Tue, 12 Sep 2006, Jacob Fugal wrote:

On 9/11/06, M. Edward (Ed) Borasky <znmeb@cesmail.net> wrote:

Jacob Fugal wrote:
> [1] It's interesting to note however that 67% of that figure is still
> in paid salaries, rather than the cost of the hardware itself. If
> you've got a super sysadmin who can manage 100 boxes (and you better
> be paying them at least 80K if they are that super), the hardware
> budget will scale a lot better. There's a lot to be said for getting
> your hands on a good sysadmin...

Ah, but does SuperSysAdmin have to use a slow scripting language?

As others have mentioned, the scripts used by sysadmins aren't really
CPU intensive nor too dependent on the host language speed. So it's a
moot point. :)

--
in order to be effective truth must penetrate like an arrow - and that is
likely to hurt. -- wei wu wei

This is interesting as far as it goes, but the challenge I have with
it is that I'm not sure the things you propose to measure are
necessarily that important in the economics of many projects (and
"economics" appears in your thread title).

We programmers tend to fantasize that what we do is the most important
part of any effort that involves software development but in many
organizations that's far from true. In particular I would guess that
there is more sensitivity to machine requirements since they tend to
involve capex and often represent a variable cost component in any
project. You may find this bizarre, but many companies view
programmers as human resources, which carry a wide range of
incremental costs going far beyond their variable contributions to any
particular project. If you make the commitment to hire a person as an
employee, it's very difficult for a whole range of reasons to let the
person go. From that point of view, a programmer is a lot like a fixed
cost. Their availability may constrain the number of projects you
attempt, but their direct cost doesn't necessarily drive your
choice of development platform. Outsourcing or using consultants
changes this calculus, but you still have to provision maintenance and
support programmers.

I would guess (though I have no data to back this up) that Ruby finds
far more acceptance in projects where the developers themselves are
the ones making the financial (or time) commitment. Under those
circumstances it's a no-brainer to use a development system that 1)
you dearly love, and 2) you are convinced will save you a lot of time,
and 3) you're convinced will let you get the job done with fewer
people. Many people who fund corporate projects couldn't care less
about 1), and think that 2) and 3) are mutually exclusive.

···

On 9/12/06, Jacob Fugal <lukfugl@gmail.com> wrote:

> My purpose was never to propose that certain values of X or Y are
> correct, but rather to provide a framework equation inside which
> different values of X and Y can be examined.

42

···

On 9/12/06, Seth Thomas Rasmussen <sethrasmussen@gmail.com> wrote:

Seriously though, I can't wait until I can put a few numbers into a
computer and have it fart out all my decisions and actions for me.