Hi all:
I have written a Ruby program that fetches web pages from a batch of URLs. After running for a while it uses a lot of memory, and it eventually gets killed by the system. I want to debug this problem. Are there any tools that would show me all the objects (including their types) that the GC has not released while the program is running?
Any suggestions are welcome.
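For reference, MRI ships with some built-in introspection that can answer part of this without any extra gems. A minimal sketch (MRI-specific; key names can vary slightly between Ruby versions):

```ruby
# Coarse counts of live objects by internal type (MRI only).
counts = ObjectSpace.count_objects
puts counts[:TOTAL]     # total heap slots
puts counts[:T_STRING]  # slots holding String objects

# GC statistics; heap_live_slots growing steadily across GC runs
# is a sign that objects are being retained rather than collected.
stats = GC.stat
puts stats[:count]            # number of GC runs so far
puts stats[:heap_live_slots]  # live object slots after the last GC
```

Watching these numbers between iterations of the fetch loop narrows down whether the leak is in Ruby objects at all, or in native memory held by a C extension.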
Hi
I don't know of such tools yet, but I have seen that some gems use an enormous amount of memory (which seems to be "normal"), maybe because the whole website is being parsed.
I have seen many -g flags when compiling the sources, so gdb could help (but that's the hard way).
Another thing is ruby-debug, but I haven't used it yet.
What do you do in your program? (Fetching alone shouldn't use much memory.)
Bye
Berg
···
On 14.09.2016 at 07:36, "timlen tse" <tinglenxan@gmail.com> wrote:
I can't vouch for anything there, BTW; it was just a search off the top of
my head.
Cheers
···
On 14 September 2016 at 15:35, timlen tse <tinglenxan@gmail.com> wrote:
I think it shouldn't, but it did.
In my program I iterate through a table (named urls) using ActiveRecord, get its url field, fetch the web page with HTTParty, parse it with Nokogiri, then extract the target tag's content and store it in the database.
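One common cause of exactly this pattern is materializing the whole urls table at once (e.g. iterating over Model.all), which keeps every row reachable for the life of the loop. ActiveRecord's find_each fetches rows in batches instead. The names below are hypothetical; this is a pure-Ruby sketch of the batching idea, with each_slice playing the role of find_each:

```ruby
# Hypothetical stand-in for a large table of URL rows.
rows = Array.new(5_000) { |i| { id: i, url: "http://example.com/page/#{i}" } }

processed = 0
# Process in fixed-size batches so only one batch is strongly reachable
# at a time; once a batch goes out of scope, the GC can reclaim it.
rows.each_slice(500) do |batch|
  batch.each do |row|
    # fetch/parse/store would happen here
    processed += 1
  end
end

puts processed  # => 5000
```

With ActiveRecord itself, the equivalent is `Urls.find_each(batch_size: 500) { |row| ... }`, which issues batched queries rather than loading every record up front.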
A Berger <aberger7890@gmail.com> wrote on Wednesday, September 14, 2016 at 2:48 PM:
···
Can you show us your code (you can use pastebin http://pastebin.com/)? Maybe there's a bottleneck in it we could identify.
···
On Wed, Sep 14, 2016, at 09:35, timlen tse wrote:
class MemDiff
  def dump
    # Tally live objects by class (MRI's ObjectSpace), then print
    # the change in each class's count since the previous dump.
    counts = Hash.new(0)
    ObjectSpace.each_object { |o| counts[o.class] += 1 }
    if @last
      counts.keys.sort_by { |c| c.name || c.inspect }.each do |c|
        diff = counts[c] - @last[c]
        printf "%-30s %20d\n", c, diff if diff != 0
      end
    end
    @last = counts
    self
  end
end

md = MemDiff.new
md.dump
10.times.map(&:to_s)
md.dump
This is of course not a real memory debugger, as it won't give you allocation sites. But the type of object that keeps increasing might give you an indication.
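If allocation sites are what's needed, MRI's objspace extension (standard library, Ruby >= 2.1) can record them while tracing is enabled. A minimal sketch:

```ruby
require 'objspace'

# Record file/line for every object allocated while tracing is on.
ObjectSpace.trace_object_allocations_start
leaky = "abc" * 100  # stand-in for a suspect allocation
ObjectSpace.trace_object_allocations_stop

# Where was this object allocated?
puts ObjectSpace.allocation_sourcefile(leaky)
puts ObjectSpace.allocation_sourceline(leaky)
```

Gems such as memory_profiler wrap this API to report retained objects grouped by allocation site, which is closer to the "real memory debugger" asked about above.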