[NBLUG/talk] Multiple writes to the same file.

Eric Eisenhart eric at nblug.org
Thu Dec 23 11:00:49 PST 2004


On Thu, Dec 23, 2004 at 09:29:45AM -0800, Walter Hansen wrote:
> Hmmm. I've got 50 processes I currently run (perl). Last night they
> produced 50 log files. It's getting annoying to read 50 log files (I
> started with 2 processes and will probably end up with 200 or so). So
> today I tried an experiment and hooked them all up to the same file and
> ran them. It worked very nicely, although I'll obviously have to identify
> each process's statements in the log. I'm redirecting STDOUT and STDERR
> to this file for all 50. Is this bad? Can it screw up? This may well make
> my job a lot easier if it does work. Oh, I didn't find much on google
> about this, but thought you guys might have an opinion.

As long as:

a) each process writes one entire line at a time
b) filehandles are opened in "append" mode
c) each line, including the trailing newline, is under 4096 bytes
   (under 512 to be safe across platforms)
d) you've done "$| = 1;" to force autoflush on (see the sketch below)

the technique you describe should work fine.
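
For instance, here's a minimal sketch of what each script could do
(untested, and the log path is made up; point it wherever you like):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # reopen STDOUT and STDERR onto the shared log in append mode
    open(STDOUT, '>>', '/tmp/jobs.log') or die "can't append to log: $!";
    open(STDERR, '>>', '/tmp/jobs.log') or die "can't append to log: $!";

    $| = 1;   # autoflush STDOUT (STDERR is already unbuffered)

    # tag each line with the PID so you can tell the 50 apart
    print "[$$] starting run\n";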

More specifically, POSIX guarantees that writes of under PIPE_BUF bytes to
a pipe are atomic, and that writes to a file opened in append mode always
land at the current end of the file, so concurrent appenders can't clobber
each other.  Setting $| to 1 in perl tells perl that you want every print
on the currently selected filehandle (STDOUT by default) to translate into
an actual write immediately.  (Normally it'll save them up until it gets a
whole line or a whole block, depending on whether it thinks it's talking
to a terminal or not.)

(in other words, it's guaranteed to work)
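
If you're curious what PIPE_BUF actually is on your platform, you can ask
POSIX.pm.  This sketch queries the limit on a pipe and falls back to the
conservative 512 if the system won't say:

    use strict;
    use warnings;
    use POSIX qw(fpathconf _PC_PIPE_BUF);

    # ask the OS for the atomic-write limit on a pipe
    pipe(my $r, my $w) or die "pipe: $!";
    my $limit = fpathconf(fileno($w), _PC_PIPE_BUF);
    $limit = 512 unless defined $limit && $limit > 0;
    print "writes up to $limit bytes are atomic\n";

On Linux that should print 4096.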
-- 
Eric Eisenhart
NBLUG Co-Founder & Scribe
The North Bay Linux Users Group
http://nblug.org/
eric at nblug.org, IRC: Freiheit at freenode, AIM: falschfreiheit, ICQ: 48217244