Here is my experience using mod_ruby and making a persistent connection.
Testing where the time is consumed when using mod_ruby and PostgreSQL, I
detected the following bottlenecks on a 500 MHz AMD:
- PostgreSQL connection on each request: 60 ms
- CGI instantiation: 60 ms
- processing the page template: 60 ms
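In case you want to reproduce these numbers, a rough way to measure each
phase is to take Time.now deltas around it. The sketch below is only an
illustration: the database name and the template string are placeholders,
and the work inside each block is whatever your own page actually does.

require 'cgi'
require 'postgres'

def timed(label)
  t0 = Time.now
  result = yield
  $stderr.puts "#{label}: #{((Time.now - t0) * 1000).round} ms"
  result
end

conn = timed('PostgreSQL connection') { PGconn.connect(nil, nil, nil, nil, 'mydb') }
cgi  = timed('CGI instantiation')     { CGI.new }
page = timed('template processing')   { "<html>...</html>" }  # your template code here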
The PostgreSQL connection overhead was fixed by adding:

RubyAddPath '/home/mingo/apache/dadlib'
RubyRequire '/home/mingo/apache/dadlib/localsettings'

to httpd.conf in the mod_ruby section, and by putting the following in the
localsettings.rb file:
require 'postgres'

module Apache
  class RubyRun
    @@pg_conn_handler = nil

    def RubyRun.pg_conn
      if !@@pg_conn_handler
        @@pg_conn_handler = PGconn.connect(nil,     # pghost
                                           nil,     # pgport
                                           nil,     # pgoptions
                                           nil,     # pgtty
                                           MAIN_DB) # database
      end
      @@pg_conn_handler
    end
  end
end
This adds a class variable to Apache::RubyRun that holds the PostgreSQL
connection handle, plus a class method that opens the connection once and
reuses it afterwards; call that method wherever a PGconn handle is needed.
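In a page it then looks roughly like this; the table name is made up and the
PGresult access may need adjusting for your version of the postgres
extension, only Apache::RubyRun.pg_conn comes from the code above:

require 'cgi'

cgi  = CGI.new
conn = Apache::RubyRun.pg_conn   # opened on the first request, reused afterwards

# 'users' is just a made-up table for illustration
res = conn.exec("SELECT count(*) FROM users")

cgi.out { "<html><body>users: #{res.result[0][0]}</body></html>" }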
In cgi.rb, comment out the following in the Cookie class declaration (the
second line is inside Cookie#initialize):

class Cookie #< SimpleDelegator
  #super(@value)  # 60 ms was consumed here
If you use CGI::Session like me, change the following line in
cgi/session.rb (since Cookie no longer delegates to its value array, the
session id has to be read from the cookie's value explicitly):

unless id
  id, = request[session_key]
  unless id
    #id, = request.cookies[session_key]          # original line
    id = request.cookies[session_key].value[0]   # modified line
  end
end
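For context, this is roughly how I drive CGI::Session in a page; the session
key and the stored field are only examples, the option names are the stock
cgi/session ones:

require 'cgi'
require 'cgi/session'

cgi     = CGI.new
session = CGI::Session.new(cgi, 'session_key' => '_my_app')  # id looked up in the '_my_app' cookie

session['visits'] = (session['visits'] || '0').succ  # example field stored in the session
session.close                                        # flush the session data to its store

cgi.out { "<html><body>visits: #{session['visits']}</body></html>" }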
And test your pages to see how fast they are now.
I use the following to see how long my pages take. At the top of the
template:
$start_t = Time.now
And at the bottom of the template:
cgi.out() { processed_template + "#{Time.now - $start_t}" }
The time consumed in Cookie initialization is mainly in the delegation
(SimpleDelegator) machinery; it is probably a good idea to implement it in a
different way, because the penalty it adds to processing a template is so
huge.
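One such different way, just as a sketch, is to delegate through
method_missing instead of defining a singleton method per delegated method
at construction time; it trades a little per-call overhead for a much
cheaper initialize. The class below is my own illustration, not something
from cgi.rb or delegate.rb:

# Minimal delegator that forwards unknown methods at call time instead of
# defining one singleton method per delegated method in initialize.
class LazyDelegator
  def initialize(obj)
    @obj = obj
  end

  def __getobj__
    @obj
  end

  def method_missing(name, *args, &block)
    @obj.__send__(name, *args, &block)
  end

  def respond_to?(name)
    super or @obj.respond_to?(name)
  end
end

cookie_value = LazyDelegator.new(["abc123"])
p cookie_value[0]       # forwarded to the wrapped array => "abc123"
p cookie_value.length   # => 1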
With the modifications above my application seems to run fine while
consuming less than half the time it did before.
Well folks, that is it. I hope it can help someone.
Sorry, but I've got an error using cgi/session when there is no cookie
already set in the browser. To correct it, do the following in
CGI::Session#initialize:
···
id, = request[session_key]
unless id
  #id, = request.cookies[session_key]   # original line
  # comment out the line above and add the lines below, up to the next comment
  begin
    id = request.cookies[session_key].value[0]  #DAD
  rescue NameError
    id = nil
  end
  # end of new code
end
Also, looking at delegate.rb I could see that with a small change we can
improve its performance (it cuts the time spent there by about 25%, though
that is still too much for me).
In Delegator#initialize:

  # add the line below to collect the method definitions and skip
  # calling eval once for each of them
  eval_methods = ''
  for method in obj.methods
    next if preserved.include? method
    # here the method definitions are stored all together, to be
    # evaluated once at the end
    eval_methods << <<-EOS
      def self.#{method}(*args, &block)
        begin
          __getobj__.__send__(:#{method}, *args, &block)
        rescue Exception
          $@.delete_if{|s| /:in `__getobj__'$/ =~ s} #`
          $@.delete_if{|s| /^\\(eval\\):/ =~ s}
          raise
        end
      end
    EOS
  end
  # and here, outside the loop, all the methods are evaluated at once,
  # using around 25% less CPU time
  eval(eval_methods)
The above could be a general performance improvement for Ruby programs
that rely heavily on delegation.
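If you want to check the difference on your own machine, a rough benchmark
like the sketch below (constructing a lot of SimpleDelegator instances
before and after patching delegate.rb) is enough; the iteration count and
the wrapped string are arbitrary:

require 'delegate'

# Delegator#initialize is where the per-method eval cost lives,
# so just constructing many delegators shows the difference.
n  = 2000
t0 = Time.now
n.times { SimpleDelegator.new("some cookie value") }
puts "#{n} SimpleDelegator.new calls: #{Time.now - t0} seconds"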
Again I hope that this will help someone.