Multiple processes logging to single file

When using the native Ruby Logger, is it safe to have multiple
processes writing to the same log file?

For instance, I have the following Ruby source example:

# test.rb
require 'logger'

log = Logger.new('/tmp/RLOG')

20.times do
  log.error "PID: #{$$}"
  sleep 1
end

log.close
# end test.rb

Then I execute like so:

$ ruby test.rb & ruby test.rb & ruby test.rb

After inspecting the log file, I see that each process logged 20
times, with all the messages interleaved, of course. It looks like it
works, but am I delusional? I'm on an OpenBSD box. Does the OS handle
caching/writing from multiple sources?

-pachl

pachl wrote:

> After inspecting the log file, I see that each process logged 20
> times, with all the messages interleaved, of course. It looks like it
> works, but am I delusional? I'm on an OpenBSD box. Does the OS handle
> caching/writing from multiple sources?

Yes, the OS should manage that (it happens at the file-system level,
rather than in userspace). However, you can tweak your code to produce
'transactional' logs, i.e. have each application log a meaningful
chunk of information in one go.
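
For example, a minimal sketch of such a 'transactional' log call could
look like the following: the chunk is assembled first and then handed
to Logger in a single call (the path and the messages are just
placeholders):

# sketch.rb
require 'logger'

log = Logger.new('/tmp/RLOG')

# Build the whole chunk up front ...
lines = [
  "PID: #{$$}",
  "step 1 done",
  "step 2 done"
]

# ... then emit it with a single Logger call, so the whole chunk goes
# out as one formatted record rather than several separate messages.
log.error(lines.join("\n"))

log.close
# end sketch.rb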

Anyway, the file system and drivers take care of the actual writing.

Though, having everything log into one logfile *may* result in fractured
files, or odd race conditions and/or deadlocks. It might be saner to log
each process to its own file.
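
If you do go the one-logfile-per-process route, a minimal sketch could
simply embed the PID in the filename (again, the path is just a
placeholder):

# per_process.rb
require 'logger'

# Each process gets its own file, e.g. /tmp/RLOG.12345
log = Logger.new("/tmp/RLOG.#{$$}")

20.times do
  log.error "PID: #{$$}"
  sleep 1
end

log.close
# end per_process.rb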

--
Phillip Gawlowski
Twitter: twitter.com/cynicalryan
Blog: http://justarubyist.blogspot.com

Don't over-comment.
~ - The Elements of Programming Style (Kernighan & Plauger)