So I was wondering how one would capture a continuous stream of data
(for instance a net radio station or a webcam) with the net/http
library? When I use the Net::HTTP#request_get method, it simply hangs
and never gives me any data. I opened up a network monitoring program,
and the data is indeed coming in; it's just that the only way I can get
the streaming to stop is to throw an exception, which makes the data
disappear. Does anyone know of a way to do this?
So, before the `read_body` block, you can examine the headers. And
inside the `read_body` block, you can handle chunks as they come in.
In your case, you'd pass them to file.write.
_why
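[A runnable sketch of the approach described above, assuming Ruby's stdlib only. The tiny TCPServer thread below is a stand-in for the real stream URL, just so the example runs self-contained; with a real station you would pass its host, port, and path to `Net::HTTP.start` and `request_get` instead.]

```ruby
require 'net/http'
require 'socket'
require 'tempfile'

# Stand-in "stream" server: speaks just enough HTTP to feed the client below.
server = TCPServer.new('127.0.0.1', 0)
port   = server.addr[1]

Thread.new do
  client = server.accept
  loop { break if client.gets == "\r\n" }              # skip request headers
  body = 'chunk1chunk2'
  client.write "HTTP/1.1 200 OK\r\n"
  client.write "Content-Type: audio/mpeg\r\n"
  client.write "Content-Length: #{body.bytesize}\r\n\r\n"
  client.write body
  client.close
end

file = Tempfile.new('capture')
Net::HTTP.start('127.0.0.1', port) do |http|
  http.request_get('/stream') do |response|
    # Headers are available here, before the body is consumed:
    puts "Content-Type: #{response['content-type']}"
    response.read_body do |chunk|                      # chunks as they arrive
      file.write(chunk)
    end
  end
end
file.flush
```

The key point is passing a block to `request_get`: without one, Net::HTTP tries to buffer the entire body (which, for an endless stream, is why it appears to hang).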
···
On Tue, Sep 23, 2008 at 03:24:49AM +0900, William Spitzer wrote:
So I was wondering how one would capture a continuous stream of data
(for instance a net radio station or a webcam) with the net/http
library?
And don't forget to put a stop condition on the loop, in case of an "everlasting download". The problem is: I don't know how he can cleanly break out of the chunk iteration (will the socket be cleanly disconnected, via an exception, etc.?).
By the way, this kind of "streaming" is called "progressive download", and both approaches have their drawbacks.
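[One way to sketch such a stop condition, under these assumptions: the byte limit `LIMIT` is arbitrary, and the local TCPServer merely simulates an effectively endless stream. Throwing out of the whole `Net::HTTP.start` block unwinds through its `ensure`, which closes the socket, so the break is clean; the connection simply can't be reused afterwards.]

```ruby
require 'net/http'
require 'socket'

# Stand-in "endless" stream server (an assumption for the sake of a
# self-contained example).
server = TCPServer.new('127.0.0.1', 0)
port   = server.addr[1]

Thread.new do
  client = server.accept
  loop { break if client.gets == "\r\n" }        # skip request headers
  client.write "HTTP/1.1 200 OK\r\n\r\n"         # no length: stream until closed
  begin
    10_000.times { client.write('x' * 1024) }    # keep pushing data
  rescue Errno::EPIPE, Errno::ECONNRESET, IOError
    # client hung up: expected when the download is cut short
  end
  client.close rescue nil
end

LIMIT = 8 * 1024                                 # arbitrary stop condition: 8 KB
received = String.new
catch(:enough) do
  Net::HTTP.start('127.0.0.1', port) do |http|
    http.request_get('/stream') do |response|
      response.read_body do |chunk|
        received << chunk
        # Unwinds past Net::HTTP.start; its ensure closes the socket.
        throw :enough if received.bytesize >= LIMIT
      end
    end
  end
end
```

Note the `catch` sits *outside* `Net::HTTP.start`: throwing only out of `read_body` would let Net::HTTP try to drain the remaining (endless) body before returning.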
regards
--FrihD(Lucas)