Chris Markle wrote:
Since I am HTTP PUT'ing large amounts of data (ideally using Net::HTTP)
I would like to feed data in some way to an HTTP PUT so that I don't
have to have the entire amount of data in memory before using methods
like request or send_request to initiate the PUT. I saw one post that
suggests it can't be done but I thought I'd check again.
Best way to understand this is to go to the source code - probably
somewhere like /usr/lib/ruby/1.8/net/http.rb on your system.
def put(path, data, initheader = nil)   #:nodoc:
  res = request(Put.new(path, initheader), data)
  res.value unless @newimpl
  res
end
Trace it through: this calls req.exec to send the request.
def exec(sock, ver, path)   #:nodoc: internal use only
  if @body
    send_request_with_body sock, ver, path, @body
  elsif @body_stream
    send_request_with_body_stream sock, ver, path, @body_stream
  else
    write_header sock, ver, path
  end
end
Looking at send_request_with_body_stream, you can pass any object which
responds to read(n), such as an open IO stream. If you set
"Transfer-Encoding: chunked" then you'll get it posted in chunks too,
which simplifies buffering at the receiving side (if it also supports
chunking).
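The contract is loose: net/http just calls read with a chunk size in a
loop and stops when it gets nil back, so the source doesn't have to be a
File at all. A quick sketch of my own (not part of net/http) of an object
that would qualify:

  # Illustration only: any object shaped like this can be a body_stream.
  # read(n) returns a string of at most n bytes, or nil when the data runs out.
  class CountingSource
    def initialize(total_bytes)
      @remaining = total_bytes
    end

    def read(n)
      return nil if @remaining <= 0
      chunk = [n, @remaining].min
      @remaining -= chunk
      "x" * chunk
    end
  end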
But it looks like you may need to create your Request object by hand and
call #body_stream=, because set_body_internal doesn't know about
streaming objects. Perhaps something like this (untested; the URL and
file path below are just placeholders):
require 'net/http'
require 'uri'
url = URI.parse("http://example.com/destination")   # placeholder URL
req = Net::HTTP::Put.new(url.path)
req.body_stream = File.open("/path/to/large/file")
req['Content-Length'] = File.size("/path/to/large/file").to_s   # body_stream needs Content-Length or chunked encoding
res = Net::HTTP.start(url.host, url.port) {|http|
  http.request(req)
}
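One caveat worth noting: send_request_with_body_stream insists on either
a Content-Length header or "Transfer-Encoding: chunked", and raises
ArgumentError otherwise. The example above supplies Content-Length; if
the total size isn't known in advance, the chunked form might look like
this (again untested; the host and path are placeholders):

  req = Net::HTTP::Put.new("/upload/destination")    # placeholder path
  req['Transfer-Encoding'] = 'chunked'
  req.body_stream = $stdin                           # any read(n)-able source
  res = Net::HTTP.start('example.com', 80) {|http|   # placeholder host
    http.request(req)
  }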
HTH,
Brian.
--
Posted via http://www.ruby-forum.com/.