Folks,
I have a Ruby application that lets users supply fairly arbitrary
URLs. The application fetches the resource at each URL using
Net::HTTP.get() and attempts to parse the response as an XML document.
So far so good.
But this is a Rails app open to the general public, and it would be
fairly trivial for someone to set up a CGI script somewhere that just
returns garbage data forever, leaving the app open to a pretty obvious
denial-of-service attack.
I'd like to specify a maximum number of bytes to read with
Net::HTTP.get() so that, for example, if the process had read more than
1MB it would raise an exception and stop reading. I haven't found a way
to do that so far, but then I confess to being fairly new to Ruby.
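For what it's worth, the closest I've come is a sketch that skips
Net::HTTP.get() and streams the body in chunks instead, bailing out once
a cap is exceeded. The method name fetch_limited, the error class, and
the 1MB default are all my own inventions, and I haven't battle-tested
this:

```ruby
require "net/http"
require "uri"

# Raised when a response body exceeds the configured byte limit.
class ResponseTooLargeError < StandardError; end

# Fetch the body at +url+, raising ResponseTooLargeError if more than
# +limit+ bytes arrive. Streams via read_body so we stop reading early
# instead of buffering the whole (possibly endless) response.
def fetch_limited(url, limit = 1_048_576)
  uri = URI.parse(url)
  body = +""
  Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
    http.request_get(uri.request_uri) do |response|
      response.read_body do |chunk|
        body << chunk
        if body.bytesize > limit
          raise ResponseTooLargeError,
                "response exceeded #{limit} bytes"
        end
      end
    end
  end
  body
end
```

Since read_body yields the body a chunk at a time, the check fires
after at most one chunk past the limit, and raising out of the block
closes the connection, so a hostile server can't keep us reading
forever.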
Does anyone have any ideas or pointers?
-Seth