swkronenfeld has asked for the wisdom of the Perl Monks concerning the following question:
I wrote (or rather, adapted from an existing one) a server skeleton, which accepts new connections and creates input and output buffers for each new connection. Currently the output buffer for each user is a single string.
In my server, I get the user's input and send it on to another file to prepare the appropriate output. That file generates the output and returns it to the server, which places it in the outgoing buffer. All pretty standard, right?
So my question: the output for each request is generated in chunks, sometimes as large as a few kilobytes per chunk. Each request tends to have at least 2 chunks generated, and as many as 5. Currently, to add a new chunk to the existing output, I do
$outBuffer .= $newChunk;
Is this copying of strings as inefficient as it seems to me that it might be? Would I be better off pushing each chunk of output onto an outgoing array? I assume that an array would use more memory, but speed is more important. Are there other issues/solutions I haven't thought of?
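To make the comparison concrete, here is a minimal sketch of the two strategies; the variable names (`$outBuffer`, `@outChunks`, `$newChunk`) follow the question, and the chunk contents are made up for illustration.

```perl
use strict;
use warnings;

# Strategy 1: a single string, appending each new chunk as it arrives.
my $outBuffer = '';
for my $newChunk ('chunk1 ', 'chunk2 ', 'chunk3') {
    $outBuffer .= $newChunk;     # append chunk to the string buffer
}

# Strategy 2: push chunks onto an array, join only when flushing to the socket.
my @outChunks;
for my $newChunk ('chunk1 ', 'chunk2 ', 'chunk3') {
    push @outChunks, $newChunk;  # earlier chunks are never touched
}
my $flushed = join '', @outChunks;

print $outBuffer eq $flushed ? "same output\n" : "differ\n";
```

Either way the bytes written to the client are identical; the question is only which bookkeeping is cheaper for this append pattern.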
(It's a pretty significant change to my program, so I tried running the Benchmark module on a small example I contrived. Alas, my company doesn't have Perl configured correctly to run a lot of modules, Benchmark being one of them.)
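In case it helps, a rough timing can be done without Benchmark by using the core Time::HiRes module; this is a sketch, not a proper benchmark, and the repetition count and 4 KB chunk size are arbitrary stand-ins for the real traffic.

```perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

my $chunk = 'x' x 4096;            # ~4 KB chunk, as described in the question
my $reps  = 10_000;

# Time the string-append strategy.
my $t0  = [gettimeofday];
my $buf = '';
$buf .= $chunk for 1 .. $reps;
my $append_s = tv_interval($t0);

# Time the push-then-join strategy.
$t0 = [gettimeofday];
my @chunks;
push @chunks, $chunk for 1 .. $reps;
my $joined = join '', @chunks;
my $join_s = tv_interval($t0);

printf "append: %.4fs  push+join: %.4fs\n", $append_s, $join_s;
```

Run it a few times and with realistic chunk counts, since a single timing of a contrived loop can be noisy.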