in reply to Multiple Perl files sharing a single socket - is it possible/sensible?

I apologize for the lack of detail in my original question. Let me explain a bit further to clear up any confusion:

This workflow will be used in a library environment. Our library is migrating to Koha, which is built on Perl and MySQL. My task is to interface Koha, which will be hosted off-site by a third party, with a local on-site server that runs a machine we use to store and pick books (think of it as a replacement for bookshelves). For the remainder of this reply I'll refer to Koha as the ILS (integrated library system) and to the local server as EMS.

As an example, a patron would log in to the Koha library catalog and place a hold on an item. Clicking "Place Hold" would trigger a request script (request.pl), which would collect contextual information about the request from MySQL and then relay that information over the socket to EMS for the request to be completed.

EMS has been designed by a third party to accept stream socket (SOCK_STREAM) connections over TCP. It processes data in the form of single-line "messages", and a robust ACK/NAK mechanism, handled by EMS, signals the success or failure of each message.
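
To make the flow in the last two paragraphs concrete, here is a minimal sketch of what request.pl's SEND side might look like. The host, port, DSN, table and column names, and the message layout are all invented placeholders; the real values are part of the confidential interface spec.

    #!/usr/bin/perl
    # Sketch of request.pl: pull hold context from MySQL, send a single-line
    # message to EMS, and check the single-line ACK/NAK reply.
    use strict;
    use warnings;
    use DBI;
    use IO::Socket::INET;

    # Collect contextual information about the hold (invented schema).
    my $dbh = DBI->connect('DBI:mysql:database=koha;host=localhost',
                           'koha_user', 'secret', { RaiseError => 1 });
    my ($barcode, $patron) = $dbh->selectrow_array(
        'SELECT item_barcode, patron_id FROM holds WHERE hold_id = ?',
        undef, $ARGV[0],
    );

    # Relay the request to EMS as one newline-terminated message.
    my $ems = IO::Socket::INET->new(
        PeerAddr => '192.0.2.10',   # placeholder EMS address
        PeerPort => 6000,           # placeholder port
        Proto    => 'tcp',
    ) or die "Cannot connect to EMS: $!";
    print {$ems} "REQUEST|$barcode|$patron\n";

    # EMS answers every message with a single-line ACK or NAK.
    my $reply = <$ems>;
    chomp $reply;
    die "EMS rejected the request: $reply\n" unless $reply =~ /^ACK/;
    print "Request accepted by EMS\n";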

Parts of the actual interface are confidential; however, I will try to answer the posed questions as best I can in the hope that it helps lead to a solid solution.

To answer NetWallah's questions:

  1. The EMS server handles incoming requests in a FIFO manner. Both sides (ILS and EMS) are to have a SEND and RECEIVE function. The EMS side already has both configured but it is up to me to create them for the ILS side.
  2. Currently I have been working with 8 separate Perl files (one per message); however, all 8 could potentially be combined into one file if I can call the necessary sub as needed (see the sketch after this list).
  3. Though it would not necessarily happen constantly, it is more likely than not that more than one message may be triggered simultaneously.
  4. All 8 messages would send their data to a single endpoint (one IP address) on a single port.
  5. The receiving end (EMS) also has a SEND and RECEIVE task.
  6. And I'm impressed that you guessed right about it being message-based!
  7. Lastly (to give a clearer picture), when the ILS RECEIVE task is active, it will make a SQL query to UPDATE the database.
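
To make points 2 and 7 concrete, here is a rough sketch of how the 8 message files could collapse into one module that opens the socket once, exposes a sub per message type, and runs the database UPDATE from its RECEIVE loop. The message formats, table, and column names are invented purely for illustration.

    package ILS::EMSLink;
    use strict;
    use warnings;
    use IO::Socket::INET;
    use DBI;

    my $socket;   # one shared TCP connection for every message type

    sub connect_ems {
        my ($host, $port) = @_;
        $socket = IO::Socket::INET->new(
            PeerAddr => $host, PeerPort => $port, Proto => 'tcp',
        ) or die "Cannot connect to EMS: $!";
    }

    # One-line send, one-line ACK/NAK back.
    sub send_message {
        my ($line) = @_;
        print {$socket} "$line\n";
        my $reply = <$socket>;
        chomp $reply;
        return $reply;
    }

    # One sub per outgoing message type, all funneled through send_message().
    sub add_item     { send_message("ADD|$_[0]") }
    sub delete_item  { send_message("DELETE|$_[0]") }
    sub return_item  { send_message("RETURN|$_[0]") }
    sub request_item { send_message("REQUEST|$_[0]") }
    sub status_check { send_message("STATUSCHECK|$_[0]") }

    # RECEIVE task: read incoming messages and UPDATE the database (point 7).
    sub receive_loop {
        my ($dsn, $user, $pass) = @_;
        my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });
        my $sth = $dbh->prepare(
            'UPDATE holds SET status = ? WHERE item_barcode = ?'   # invented schema
        );
        while (my $line = <$socket>) {
            chomp $line;
            if ($line =~ /^STATUS\|(\w+)\|(\w+)$/) {
                $sth->execute($2, $1);      # ($status, $barcode)
                print {$socket} "ACK\n";    # Response back to EMS
            }
        }
    }

    1;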

I hope that I provided enough clarification to clear up the confusion!

Re^2: Multiple Perl files sharing a single socket - is it possible/sensible?
by mr_mischief (Monsignor) on Dec 02, 2015 at 15:11 UTC

    It is entirely possible for the server side to be ready for multiple client connections. I would guess EMS has no problem if all eight of your client scripts connect to it at once. Is there some order which needs to be enforced on these messages, or are they fine to arrive in any order?

      I should amend my statement about "all 8 messages at once", since most messages only go one way. Like so:

      • Response message - RECEIVE to SEND task on both sides (both ILS and EMS handle this message; it carries a code indicating whether a message was accepted or rejected by the socket)
      • Ping message - SEND to RECEIVE task (this acts as the "keep alive" mechanism); pings are sent from both sides at 40-second intervals (see the sketch after this list).
      • Add message - ILS to EMS only
      • Delete message - ILS to EMS only
      • Return message - ILS to EMS only
      • Request message - ILS to EMS only
      • Status Check message - ILS to EMS only
      • Status message - EMS to ILS only
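
      Since the Ping message runs on a fixed 40-second interval, one simple approach on the ILS side is a forked child that does nothing but ping. The endpoint and the "PING" text below are placeholders, not the real interface values.

          #!/usr/bin/perl
          use strict;
          use warnings;
          use IO::Socket::INET;

          my $ems = IO::Socket::INET->new(
              PeerAddr => '192.0.2.10', PeerPort => 6000, Proto => 'tcp',
          ) or die "Cannot connect to EMS: $!";

          # Keep-alive: the child pings EMS every 40 seconds while the
          # parent continues with the normal SEND/RECEIVE work.
          my $pid = fork();
          die "fork failed: $!" unless defined $pid;
          if ($pid == 0) {
              while (1) {
                  print {$ems} "PING\n";
                  sleep 40;
              }
          }
          # ... parent's SEND/RECEIVE logic goes here ...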

      As a visual representation, a typical transaction would follow this order:

      ILS                                  EMS
      ---                                  ---
      Request   -->
                <--  Response (request accepted/rejected)
                <--  Status (requested item is available/unavailable)
      Response  -->
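
      From the ILS side, that exchange might be driven by something like the sketch below; the message text and the ACK convention are illustrative assumptions, not the actual interface spec.

          #!/usr/bin/perl
          use strict;
          use warnings;
          use IO::Socket::INET;

          my $ems = IO::Socket::INET->new(
              PeerAddr => '192.0.2.10', PeerPort => 6000, Proto => 'tcp',
          ) or die "Cannot connect to EMS: $!";

          print {$ems} "REQUEST|123456\n";    # ILS -> EMS: Request

          my $response = <$ems>;              # EMS -> ILS: Response (accept/reject)
          chomp $response;
          die "Request rejected: $response\n" unless $response =~ /^ACK/;

          my $status = <$ems>;                # EMS -> ILS: Status (available or not)
          chomp $status;
          print {$ems} "ACK\n";               # ILS -> EMS: Response to the Status
          print "Item status: $status\n";
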
Re^2: Multiple Perl files sharing a single socket - is it possible/sensible?
by FreeBeerReekingMonk (Deacon) on Dec 02, 2015 at 20:35 UTC

    Too bad you already started; there is software that can help. For example, if you can stand the way Dancer2 is programmed (node.js for Perl), you can use Dancer2::Plugin::Queue, which will let you receive many messages, put them into a single queue, and then poll that queue to process them.

    Then there is Message Queue in Perl and some nice abstractions on CPAN that may be worth looking into.
    I noted no encryption requirements...

      That is correct...there is no encryption on these messages.

      I would be open to software solutions to the problem, except that it would be more cumbersome than necessary to handle and maintain both the MySQL data extraction AND the message socket. I would be able to do it if we were hosting this Koha instance locally, but it will be hosted by a third party.